Apr 28 19:16:55.369452 ip-10-0-141-41 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:16:55.847265 ip-10-0-141-41 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:55.847265 ip-10-0-141-41 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:16:55.847265 ip-10-0-141-41 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:55.847265 ip-10-0-141-41 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:16:55.847265 ip-10-0-141-41 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
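The deprecation warnings above all point at the same remedy: move the flag values into the file passed to --config. A minimal sketch of what that migration might look like as a KubeletConfiguration, assuming the fields named in the kubelet config-file docs; the concrete values below are illustrative, not taken from this node:

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (CRI-O socket, as seen later in this log)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (path here is a placeholder)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```

On this node the kubelet already runs with --config=/etc/kubernetes/kubelet.conf (visible in the FLAG dump later in the log), so these settings would belong there rather than on the command line.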
Apr 28 19:16:55.850704 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.850578 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:16:55.858922 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858900 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:55.858922 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858918 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:55.858922 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858923 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:55.858922 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858927 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:55.858922 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858931 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858935 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858940 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858943 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858947 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858951 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858955 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858958 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858963 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858967 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858971 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858975 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858978 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858982 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858989 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.858996 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859001 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859005 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859009 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:55.859229 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859014 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859018 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859032 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859037 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859041 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859046 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859051 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859056 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859060 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859065 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859069 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859073 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859078 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859082 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859086 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859092 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859097 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859103 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859107 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859111 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:55.859971 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859115 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859120 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859124 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859127 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859132 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859136 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859140 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859145 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859149 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859153 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859173 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859179 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859183 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859187 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859192 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859195 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859201 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859206 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859210 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859214 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:55.860710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859220 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859224 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859228 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859232 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859236 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859241 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859245 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859250 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859255 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859259 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859264 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859268 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859272 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859280 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859286 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859291 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859296 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859301 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859305 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:55.861253 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859310 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859315 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859319 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859323 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859967 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859976 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859981 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859985 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859989 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859993 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.859997 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860001 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860006 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860010 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860015 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860020 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860024 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860031 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860037 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:55.861713 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860042 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860048 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860052 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860057 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860061 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860066 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860071 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860075 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860079 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860084 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860089 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860093 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860097 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860102 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860106 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860110 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860114 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860119 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860122 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860127 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:55.862303 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860132 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860137 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860142 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860146 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860151 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860155 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860180 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860184 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860189 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860193 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860197 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860202 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860206 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860211 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860215 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860220 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860224 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860228 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860232 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860236 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:55.863180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860241 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860245 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860251 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860256 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860261 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860265 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860269 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860273 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860279 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860283 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860290 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860294 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860298 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860303 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860307 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860311 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860315 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860319 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860323 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:55.863833 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860328 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860332 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860336 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860340 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860344 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860348 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860353 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860357 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860361 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860365 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860370 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.860374 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862037 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862054 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862065 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862071 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862080 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862086 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862093 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862100 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862106 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:16:55.864372 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862111 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862117 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862123 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862128 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862133 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862138 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862142 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862147 2565 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862152 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862157 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862180 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862184 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862190 2565 flags.go:64] FLAG: --config-dir=""
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862195 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862200 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862206 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862211 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862216 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862222 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862227 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862231 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862236 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862241 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862246 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862253 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:16:55.864927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862258 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862262 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862267 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862273 2565 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862277 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862285 2565 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862290 2565 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862294 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862299 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862304 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862311 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862315 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862320 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862325 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862330 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862336 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862341 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862346 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862351 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862356 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862360 2565 flags.go:64] FLAG: --feature-gates=""
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862366 2565 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862371 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862376 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862382 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862387 2565 flags.go:64] FLAG: --healthz-port="10248"
Apr 28 19:16:55.865812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862392 2565 flags.go:64] FLAG: --help="false"
Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428
19:16:55.862397 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-141-41.ec2.internal" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862402 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862407 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862411 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862417 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862423 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862428 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862433 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862437 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862464 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862470 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862475 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862480 2565 flags.go:64] FLAG: --kube-reserved="" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862485 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 28 
19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862490 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862495 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862500 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862505 2565 flags.go:64] FLAG: --lock-file="" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862509 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862515 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862520 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862541 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 28 19:16:55.866487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862546 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862551 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862555 2565 flags.go:64] FLAG: --logging-format="text" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862560 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862566 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862571 2565 flags.go:64] FLAG: --manifest-url="" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862576 2565 
flags.go:64] FLAG: --manifest-url-header="" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862583 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862588 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862595 2565 flags.go:64] FLAG: --max-pods="110" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862600 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862604 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862609 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862614 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862618 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862626 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862631 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862645 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862649 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862654 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862660 2565 flags.go:64] FLAG: --pod-cidr="" Apr 28 19:16:55.867154 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862666 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862673 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862678 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 28 19:16:55.867154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862683 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862688 2565 flags.go:64] FLAG: --port="10250" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862693 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862698 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05ecfc95fe4dd754e" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862703 2565 flags.go:64] FLAG: --qos-reserved="" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862708 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862713 2565 flags.go:64] FLAG: --register-node="true" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862717 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862722 2565 flags.go:64] FLAG: --register-with-taints="" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862729 2565 flags.go:64] FLAG: --registry-burst="10" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862733 2565 flags.go:64] FLAG: --registry-qps="5" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:16:55.862738 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862743 2565 flags.go:64] FLAG: --reserved-memory="" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862749 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862754 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862759 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862764 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862769 2565 flags.go:64] FLAG: --runonce="false" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862774 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862779 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862784 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862789 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862793 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862800 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862806 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 28 19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862811 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 28 
19:16:55.867755 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862816 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862820 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862825 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862830 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862835 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862840 2565 flags.go:64] FLAG: --system-cgroups="" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862845 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862856 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862861 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862866 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862873 2565 flags.go:64] FLAG: --tls-min-version="" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862878 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862882 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862887 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862893 2565 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862897 2565 flags.go:64] FLAG: --v="2" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862904 2565 flags.go:64] FLAG: --version="false" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862911 2565 flags.go:64] FLAG: --vmodule="" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862917 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.862923 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863081 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863089 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863094 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863099 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:55.868450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863104 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863109 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863113 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863118 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:16:55.869021 ip-10-0-141-41 
kubenswrapper[2565]: W0428 19:16:55.863121 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863127 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863132 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863137 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863141 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863145 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863149 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863154 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863177 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863182 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863186 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863192 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863196 2565 feature_gate.go:328] unrecognized 
feature gate: MixedCPUsAllocation Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863200 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863204 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863209 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:16:55.869021 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863213 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863217 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863222 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863227 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863232 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863237 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863241 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863245 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863249 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863254 2565 feature_gate.go:328] 
unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863258 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863262 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863266 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863270 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863274 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863279 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863284 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863289 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863294 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863298 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:16:55.869585 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863302 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863306 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: 
W0428 19:16:55.863310 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863315 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863319 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863325 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863329 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863335 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863340 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863343 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863348 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863352 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863356 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863361 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863365 2565 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863371 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863377 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863382 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863387 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:55.870135 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863391 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863396 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863401 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863405 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863409 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863413 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863417 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863421 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 
19:16:55.863425 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863429 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863436 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863440 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863444 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863449 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863453 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863458 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863462 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863466 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863470 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863476 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:16:55.870635 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863483 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:16:55.871132 
ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863487 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:16:55.871132 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.863492 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 28 19:16:55.871132 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.864252 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 28 19:16:55.871386 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.871366 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 28 19:16:55.871420 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.871387 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 28 19:16:55.871450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871437 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:55.871450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871443 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:55.871450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871446 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:16:55.871450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871449 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:55.871450 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871452 2565 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871455 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871458 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871460 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871464 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871467 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871469 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871472 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871474 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871477 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871480 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871483 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871485 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871488 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871490 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871494 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871496 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871500 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871502 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871505 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:55.871574 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871507 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871512 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871516 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871519 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871522 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871525 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871527 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871534 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871537 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871539 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871542 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871544 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871547 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871550 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871554 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871556 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871559 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871562 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871564 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:55.872051 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871567 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871570 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871572 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871575 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871577 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871580 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871583 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871586 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871589 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871593 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871596 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871599 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871601 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871604 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871606 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871609 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871612 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871614 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871617 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:55.872547 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871619 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871622 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871628 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871631 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871634 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871636 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871639 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871642 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871644 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871647 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871649 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871652 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871655 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871657 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871659 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871662 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871664 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871667 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871669 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871672 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:55.873052 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871674 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871677 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871680 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871682 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.871687 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871797 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871801 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871804 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871807 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871810 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871812 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871815 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871817 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871820 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871822 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871826 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:55.873608 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871829 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871832 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871834 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871837 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871840 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871843 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871846 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871848 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871851 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871853 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871855 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871858 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871861 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871863 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871866 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871868 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871871 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871874 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871876 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:55.874011 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871878 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871881 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871883 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871886 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871888 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871891 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871893 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871896 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871899 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871901 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871903 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871906 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871909 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871912 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871915 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871917 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871920 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871923 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871925 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871928 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:55.874492 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871930 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871932 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871935 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871937 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871940 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871944 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871948 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871951 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871954 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871956 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871959 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871961 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871964 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871967 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871969 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871971 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871974 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871976 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871979 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871981 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:55.874968 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871984 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871986 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871989 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871992 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871995 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.871998 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872004 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872006 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872009 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872012 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872014 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872017 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872019 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872022 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872024 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:55.875477 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:55.872027 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:55.875836 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.872032 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:55.875836 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.872752 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:16:55.875836 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.874858 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:16:55.876172 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.876145 2565 server.go:1019] "Starting client certificate rotation"
Apr 28 19:16:55.876266 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.876250 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:55.876304 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.876295 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:55.903436 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.903415 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:55.910120 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.910026 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:55.922307 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.922287 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:16:55.928364 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.928343 2565 log.go:25] "Validated CRI v1 image API"
Apr 28 19:16:55.929606 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.929580 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:16:55.934361 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.934341 2565 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 f2cbf017-0cda-4c19-a880-45a83b0580a1:/dev/nvme0n1p4 f7a64f2a-b569-49bb-8985-49057e43de71:/dev/nvme0n1p3]
Apr 28 19:16:55.934431 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.934361 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:16:55.940334 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.940222 2565 manager.go:217] Machine: {Timestamp:2026-04-28 19:16:55.938055604 +0000 UTC m=+0.443772262 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100998 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22fa99d2104e4e90611222d77e0233 SystemUUID:ec22fa99-d210-4e4e-9061-1222d77e0233 BootID:cee429c6-9c1e-4be6-b0c4-1873064a1c9d Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e6:83:fa:74:4b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e6:83:fa:74:4b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:96:f6:3d:29:d7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:16:55.940843 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.940833 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 28 19:16:55.940933 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.940921 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:16:55.942551 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.942526 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:16:55.942688 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.942553 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-41.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 28 19:16:55.942736 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.942697 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 28 19:16:55.942736 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.942705 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 28 19:16:55.942736 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.942718 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:16:55.943891 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.943880 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:16:55.945285 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.945274 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 28 19:16:55.945435 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.945427 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 28 19:16:55.948598 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.948585 2565 kubelet.go:491] "Attempting to sync node with API server"
Apr 28 19:16:55.948642 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.948602 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 28 19:16:55.948642 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.948615 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 28 19:16:55.948642 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.948624 2565 kubelet.go:397] "Adding apiserver pod source"
Apr 28 19:16:55.948642 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.948633 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 28 19:16:55.949850 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.949838 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:16:55.949895 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.949857 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:16:55.953103 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.953085 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 28 19:16:55.955016 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.955002 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 28 19:16:55.956373 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956348 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 28 19:16:55.956373 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956376 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956387 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956396 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956404 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956419 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956428 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956438 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956450 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956458 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 28 19:16:55.956514 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956466 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 28 19:16:55.956514
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.956476 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:16:55.957383 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.957369 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:16:55.957383 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.957381 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:16:55.960910 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.960897 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:16:55.960986 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.960975 2565 server.go:1295] "Started kubelet" Apr 28 19:16:55.961055 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.961033 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 28 19:16:55.962922 ip-10-0-141-41 systemd[1]: Started Kubernetes Kubelet. Apr 28 19:16:55.963498 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.963356 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:16:55.963556 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.963510 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:16:55.964857 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.964839 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:16:55.966117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.966099 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:16:55.971584 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.971548 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:16:55.973072 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:55.973056 2565 
kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:16:55.973187 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.973110 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:55.973664 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.973628 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:16:55.974216 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:55.972999 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-41.ec2.internal.18aa9b5a80c613a5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-41.ec2.internal,UID:ip-10-0-141-41.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-41.ec2.internal,},FirstTimestamp:2026-04-28 19:16:55.960908709 +0000 UTC m=+0.466625365,LastTimestamp:2026-04-28 19:16:55.960908709 +0000 UTC m=+0.466625365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-41.ec2.internal,}" Apr 28 19:16:55.974505 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974490 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:16:55.974505 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974506 2565 factory.go:55] Registering systemd factory Apr 28 19:16:55.974611 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:16:55.974513 2565 factory.go:223] Registration of the systemd container factory successfully Apr 28 19:16:55.974611 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974530 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:16:55.974611 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974533 2565 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:16:55.974611 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974556 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:16:55.974779 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974633 2565 reconstruct.go:97] "Volume reconstruction finished" Apr 28 19:16:55.974779 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974644 2565 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:16:55.974779 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974758 2565 factory.go:153] Registering CRI-O factory Apr 28 19:16:55.974779 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974773 2565 factory.go:223] Registration of the crio container factory successfully Apr 28 19:16:55.974908 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974794 2565 factory.go:103] Registering Raw factory Apr 28 19:16:55.974908 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.974807 2565 manager.go:1196] Started watching for new ooms in manager Apr 28 19:16:55.975150 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:55.975128 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found" Apr 28 19:16:55.975584 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.975569 2565 manager.go:319] Starting recovery of all containers Apr 28 19:16:55.985781 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.985760 2565 manager.go:324] Recovery completed Apr 28 19:16:55.986348 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:55.986291 2565 reflector.go:200] 
"Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 28 19:16:55.986496 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.986456 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-41.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:16:55.986613 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:55.986587 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-41.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:16:55.986894 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:55.986865 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-41.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 28 19:16:55.987411 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:55.987380 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 28 19:16:55.990155 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.990131 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l84jn" Apr 28 19:16:55.992013 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.991998 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:55.996198 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.996183 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:55.996268 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.996211 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:55.996268 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.996224 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:55.996748 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.996731 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 28 19:16:55.996748 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.996746 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:16:55.996851 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.996764 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:55.998649 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:55.998589 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-41.ec2.internal.18aa9b5a82e088a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-41.ec2.internal,UID:ip-10-0-141-41.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-41.ec2.internal status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-41.ec2.internal,},FirstTimestamp:2026-04-28 19:16:55.996197033 +0000 UTC m=+0.501913690,LastTimestamp:2026-04-28 19:16:55.996197033 +0000 UTC m=+0.501913690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-41.ec2.internal,}" Apr 28 19:16:55.999482 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.999468 2565 policy_none.go:49] "None policy: Start" Apr 28 19:16:55.999555 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.999487 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:16:55.999555 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:55.999500 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 28 19:16:56.007881 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.007729 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-41.ec2.internal.18aa9b5a82e0db36 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-41.ec2.internal,UID:ip-10-0-141-41.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-141-41.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-141-41.ec2.internal,},FirstTimestamp:2026-04-28 19:16:55.996218166 +0000 UTC m=+0.501934823,LastTimestamp:2026-04-28 19:16:55.996218166 +0000 UTC m=+0.501934823,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-41.ec2.internal,}" Apr 28 19:16:56.013382 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.013366 2565 csr.go:270] "Certificate signing 
request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l84jn" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.031544 2565 manager.go:341] "Starting Device Plugin manager" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.031588 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.031601 2565 server.go:85] "Starting device plugin registration server" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.031792 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.031801 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.031926 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.031995 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.032003 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.032554 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 28 19:16:56.046711 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.032596 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-41.ec2.internal\" not found" Apr 28 19:16:56.104259 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.104184 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 28 19:16:56.105473 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.105457 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 28 19:16:56.105568 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.105487 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:16:56.105568 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.105507 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 28 19:16:56.105568 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.105517 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 28 19:16:56.105568 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.105555 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 28 19:16:56.108445 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.108427 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:56.132872 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.132853 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:56.133752 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.133737 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:56.133835 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.133763 2565 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:56.133835 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.133774 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:56.133835 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.133796 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.142909 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.142893 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.142969 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.142916 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-41.ec2.internal\": node \"ip-10-0-141-41.ec2.internal\" not found" Apr 28 19:16:56.157275 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.157256 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found" Apr 28 19:16:56.206605 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.206582 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal"] Apr 28 19:16:56.206688 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.206647 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:56.207619 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.207601 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:56.207695 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.207630 2565 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-141-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:56.207695 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.207641 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:56.208938 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.208925 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:56.209084 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.209071 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.209122 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.209099 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:56.209664 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.209649 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:56.209735 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.209680 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:56.209735 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.209703 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:56.209735 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.209712 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:56.209843 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.209683 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:56.209843 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.209759 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:56.211051 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.211037 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.211118 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.211061 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:56.211795 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.211780 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:56.211874 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.211804 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:56.211874 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.211818 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:56.228702 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.228682 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-41.ec2.internal\" not found" node="ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.232690 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.232666 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-41.ec2.internal\" not found" node="ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.257762 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.257745 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found" Apr 
28 19:16:56.358462 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.358389 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found" Apr 28 19:16:56.375885 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.375864 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17b813db0ef7dca0e7ffcd43ed6816a6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal\" (UID: \"17b813db0ef7dca0e7ffcd43ed6816a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.375955 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.375899 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17b813db0ef7dca0e7ffcd43ed6816a6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal\" (UID: \"17b813db0ef7dca0e7ffcd43ed6816a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.375955 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.375925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad9143c56694e3dad0c99502369b2b4a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-41.ec2.internal\" (UID: \"ad9143c56694e3dad0c99502369b2b4a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.458937 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.458906 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found" Apr 28 19:16:56.476278 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.476247 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/17b813db0ef7dca0e7ffcd43ed6816a6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal\" (UID: \"17b813db0ef7dca0e7ffcd43ed6816a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.476327 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.476285 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17b813db0ef7dca0e7ffcd43ed6816a6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal\" (UID: \"17b813db0ef7dca0e7ffcd43ed6816a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.476327 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.476309 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad9143c56694e3dad0c99502369b2b4a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-41.ec2.internal\" (UID: \"ad9143c56694e3dad0c99502369b2b4a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.476395 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.476372 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17b813db0ef7dca0e7ffcd43ed6816a6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal\" (UID: \"17b813db0ef7dca0e7ffcd43ed6816a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" Apr 28 19:16:56.476395 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.476386 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad9143c56694e3dad0c99502369b2b4a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-41.ec2.internal\" (UID: \"ad9143c56694e3dad0c99502369b2b4a\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal"
Apr 28 19:16:56.476466 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.476374 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17b813db0ef7dca0e7ffcd43ed6816a6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal\" (UID: \"17b813db0ef7dca0e7ffcd43ed6816a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal"
Apr 28 19:16:56.531396 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.531354 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal"
Apr 28 19:16:56.535070 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.535050 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal"
Apr 28 19:16:56.559909 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.559879 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found"
Apr 28 19:16:56.660491 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.660400 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found"
Apr 28 19:16:56.760979 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.760946 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found"
Apr 28 19:16:56.861476 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.861445 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found"
Apr 28 19:16:56.875696 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.875676 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 28 19:16:56.875840 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.875820 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 28 19:16:56.962425 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:56.962397 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found"
Apr 28 19:16:56.973562 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.973541 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 28 19:16:56.992048 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:56.992027 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:16:57.015217 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.015189 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:11:55 +0000 UTC" deadline="2027-11-10 23:48:29.641693465 +0000 UTC"
Apr 28 19:16:57.015217 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.015216 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13468h31m32.626480379s"
Apr 28 19:16:57.019109 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.019089 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-p4qlk"
Apr 28 19:16:57.019867 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:57.019839 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b813db0ef7dca0e7ffcd43ed6816a6.slice/crio-e4537214c73d137c3b69f12f61a619d28d36f488d3dbb7405772a3ffee8af800 WatchSource:0}: Error finding container e4537214c73d137c3b69f12f61a619d28d36f488d3dbb7405772a3ffee8af800: Status 404 returned error can't find the container with id e4537214c73d137c3b69f12f61a619d28d36f488d3dbb7405772a3ffee8af800
Apr 28 19:16:57.020342 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:57.020324 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad9143c56694e3dad0c99502369b2b4a.slice/crio-31923d1c4a6746dc268b3dc29edbf683628668911ca5adde732a85c011872e94 WatchSource:0}: Error finding container 31923d1c4a6746dc268b3dc29edbf683628668911ca5adde732a85c011872e94: Status 404 returned error can't find the container with id 31923d1c4a6746dc268b3dc29edbf683628668911ca5adde732a85c011872e94
Apr 28 19:16:57.024278 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.024263 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:16:57.025880 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.025863 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-p4qlk"
Apr 28 19:16:57.062951 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:57.062914 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found"
Apr 28 19:16:57.108200 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.108127 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" event={"ID":"17b813db0ef7dca0e7ffcd43ed6816a6","Type":"ContainerStarted","Data":"e4537214c73d137c3b69f12f61a619d28d36f488d3dbb7405772a3ffee8af800"}
Apr 28 19:16:57.109000 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.108980 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal" event={"ID":"ad9143c56694e3dad0c99502369b2b4a","Type":"ContainerStarted","Data":"31923d1c4a6746dc268b3dc29edbf683628668911ca5adde732a85c011872e94"}
Apr 28 19:16:57.128308 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.128290 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:57.163666 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:57.163644 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-41.ec2.internal\" not found"
Apr 28 19:16:57.205696 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.205639 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:57.275006 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.274977 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal"
Apr 28 19:16:57.287737 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.287716 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 28 19:16:57.288621 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.288606 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal"
Apr 28 19:16:57.298895 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.298880 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 28 19:16:57.518156 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.517953 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:57.949896 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.949821 2565 apiserver.go:52] "Watching apiserver"
Apr 28 19:16:57.957423 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.957399 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 28 19:16:57.957846 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.957819 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ttwth","openshift-multus/multus-64s26","openshift-multus/multus-additional-cni-plugins-ll4ff","kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal","openshift-cluster-node-tuning-operator/tuned-lsmn8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal","openshift-multus/network-metrics-daemon-hndjc","openshift-network-diagnostics/network-check-target-j22v7","openshift-network-operator/iptables-alerter-hpgl5","openshift-ovn-kubernetes/ovnkube-node-pqj2n","kube-system/konnectivity-agent-x9j9g","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw","openshift-dns/node-resolver-gg56k"]
Apr 28 19:16:57.960498 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.960481 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.961844 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.961823 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.963277 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.963224 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 28 19:16:57.963277 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.963238 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.963277 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.963249 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 28 19:16:57.963277 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.963261 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zzcp7\""
Apr 28 19:16:57.963527 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.963254 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.963620 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.963599 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 28 19:16:57.963917 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.963897 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 28 19:16:57.964332 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.964314 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 28 19:16:57.964610 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.964559 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.964610 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.964583 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:57.964959 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.964756 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.965040 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.965016 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.965099 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.965065 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 28 19:16:57.965567 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.965544 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-knvt4\""
Apr 28 19:16:57.966764 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.966743 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:16:57.966861 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:57.966839 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:16:57.968068 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.967696 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 28 19:16:57.969596 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.968401 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 28 19:16:57.969596 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.968631 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vjt94\""
Apr 28 19:16:57.969596 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.968859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-rdqkb\""
Apr 28 19:16:57.969596 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.968870 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.969596 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.969330 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.971724 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.971703 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:16:57.971819 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:57.971777 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1"
Apr 28 19:16:57.971934 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.971828 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hpgl5"
Apr 28 19:16:57.974074 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.974055 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ttwth"
Apr 28 19:16:57.977432 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.977194 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:16:57.977966 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.977947 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.978043 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.977993 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 28 19:16:57.978043 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.978023 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.978456 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.978437 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.978542 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.978486 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.978542 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.978535 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lfw5v\""
Apr 28 19:16:57.978642 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.978442 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-bn9qj\""
Apr 28 19:16:57.978642 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.978609 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 28 19:16:57.979517 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.979498 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 28 19:16:57.979617 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.979575 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 28 19:16:57.979855 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.979839 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ffmqm\""
Apr 28 19:16:57.980058 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.980022 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw"
Apr 28 19:16:57.980058 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.980056 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gg56k"
Apr 28 19:16:57.982843 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.982823 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 28 19:16:57.983039 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.983020 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.983118 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.983013 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 28 19:16:57.983118 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.983050 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2t6hn\""
Apr 28 19:16:57.983118 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.983099 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-ftc49\""
Apr 28 19:16:57.983279 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.983261 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.983373 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.983347 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 28 19:16:57.984912 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.984891 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysconfig\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.985040 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.984927 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-etc-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.985040 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.984955 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.985040 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.984979 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-netns\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.985040 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985024 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-system-cni-dir\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:57.985269 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985132 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cnibin\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:57.985269 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985202 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-systemd\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.985269 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985254 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-slash\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.985426 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985284 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-node-log\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.985426 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985326 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-kubelet\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.985426 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985351 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysctl-conf\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.985426 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985374 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-cni-netd\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.985426 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985396 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-cnibin\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.985426 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985420 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-cni-bin\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985442 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-etc-kubernetes\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985465 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985492 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-host\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985515 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-ovn\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985536 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-socket-dir-parent\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985560 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvqx\" (UniqueName: \"kubernetes.io/projected/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-kube-api-access-4zvqx\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985590 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2df21b98-cdf7-4d04-a8e9-36920fde23bd-konnectivity-ca\") pod \"konnectivity-agent-x9j9g\" (UID: \"2df21b98-cdf7-4d04-a8e9-36920fde23bd\") " pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985612 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-var-lib-kubelet\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-cni-bin\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.985685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985682 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46f624dd-71ff-4136-b6dd-c90053e2799c-ovn-node-metrics-cert\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985706 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2df21b98-cdf7-4d04-a8e9-36920fde23bd-agent-certs\") pod \"konnectivity-agent-x9j9g\" (UID: \"2df21b98-cdf7-4d04-a8e9-36920fde23bd\") " pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985729 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysctl-d\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985772 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce296200-b7fe-4c18-8f04-0fcf672d1613-tmp\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985811 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-var-lib-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985832 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-multus-certs\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985851 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-sys\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985876 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7hs\" (UniqueName: \"kubernetes.io/projected/ce296200-b7fe-4c18-8f04-0fcf672d1613-kube-api-access-xm7hs\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985900 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-systemd\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985924 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985950 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-env-overrides\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.985995 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-k8s-cni-cncf-io\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986030 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jc2h\" (UniqueName: \"kubernetes.io/projected/2e96850e-1476-4be5-9535-99fe12d6740c-kube-api-access-9jc2h\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986061 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-run\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986098 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.986117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986123 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-os-release\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986146 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986196 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-run-netns\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986221 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b36aafe-8438-4826-93f5-10d39349a4f7-cni-binary-copy\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986244 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kss4\" (UniqueName: \"kubernetes.io/projected/3b36aafe-8438-4826-93f5-10d39349a4f7-kube-api-access-9kss4\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986269 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986292 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-serviceca\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986315 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-conf-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986337 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986360 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-modprobe-d\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986383 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-kubernetes\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986414 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-systemd-units\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986440 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\"
(UniqueName: \"kubernetes.io/configmap/2e96850e-1476-4be5-9535-99fe12d6740c-iptables-alerter-script\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5" Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986459 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-host\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth" Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986514 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-kubelet\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986564 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-log-socket\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:57.986768 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986589 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-hostroot\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986605 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrqx\" (UniqueName: \"kubernetes.io/projected/cbe36bec-c099-4625-b8c9-eb67c281b442-kube-api-access-tsrqx\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986643 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-ovnkube-config\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986671 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clr8g\" (UniqueName: \"kubernetes.io/projected/46f624dd-71ff-4136-b6dd-c90053e2799c-kube-api-access-clr8g\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986695 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986729 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-tuned\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986754 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fll5\" (UniqueName: \"kubernetes.io/projected/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-kube-api-access-2fll5\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986800 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-ovnkube-script-lib\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986824 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-system-cni-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986867 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-cni-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986904 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-cni-multus\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986936 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-daemon-config\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986963 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-lib-modules\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.986990 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-os-release\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.987016 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff" Apr 28 19:16:57.987502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:57.987040 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e96850e-1476-4be5-9535-99fe12d6740c-host-slash\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5" Apr 28 19:16:58.027280 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.027250 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:57 +0000 UTC" deadline="2027-11-27 18:00:28.501870772 +0000 UTC" Apr 28 19:16:58.027280 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.027278 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13870h43m30.474595808s" Apr 28 19:16:58.075412 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.075386 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 28 19:16:58.087424 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087391 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46f624dd-71ff-4136-b6dd-c90053e2799c-ovn-node-metrics-cert\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.087424 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087433 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2df21b98-cdf7-4d04-a8e9-36920fde23bd-agent-certs\") pod \"konnectivity-agent-x9j9g\" (UID: \"2df21b98-cdf7-4d04-a8e9-36920fde23bd\") " pod="kube-system/konnectivity-agent-x9j9g" Apr 28 19:16:58.087635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087463 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-socket-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.087635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087489 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysctl-d\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.087635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087513 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce296200-b7fe-4c18-8f04-0fcf672d1613-tmp\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.087635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087576 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-var-lib-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.087635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087619 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-multus-certs\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087648 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-sys\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087675 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7hs\" (UniqueName: \"kubernetes.io/projected/ce296200-b7fe-4c18-8f04-0fcf672d1613-kube-api-access-xm7hs\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087701 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-systemd\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087713 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-var-lib-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087727 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087769 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-env-overrides\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087780 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087796 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-k8s-cni-cncf-io\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087821 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jc2h\" (UniqueName: \"kubernetes.io/projected/2e96850e-1476-4be5-9535-99fe12d6740c-kube-api-access-9jc2h\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087826 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-multus-certs\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:58.087867 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:16:58.087845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-run\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.087867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087868 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087897 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-os-release\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087870 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-device-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087943 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-sys\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087953 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087979 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-run-netns\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088000 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-k8s-cni-cncf-io\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088002 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b36aafe-8438-4826-93f5-10d39349a4f7-cni-binary-copy\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088038 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kss4\" (UniqueName: \"kubernetes.io/projected/3b36aafe-8438-4826-93f5-10d39349a4f7-kube-api-access-9kss4\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088078 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-systemd\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088089 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-serviceca\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088112 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-conf-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088137 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088182 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-tmp-dir\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088230 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-modprobe-d\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088246 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-run\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.088422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088257 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-kubernetes\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088303 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-systemd-units\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088334 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088346 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e96850e-1476-4be5-9535-99fe12d6740c-iptables-alerter-script\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5" Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088366 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-env-overrides\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088381 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088384 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mch8c\" (UniqueName: \"kubernetes.io/projected/a12c3700-ac6d-481d-8acc-1ed63740cc01-kube-api-access-mch8c\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088438 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-host\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088458 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-os-release\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088465 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-kubelet\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088489 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-log-socket\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088511 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-kubernetes\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088534 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-hostroot\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088539 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-modprobe-d\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088554 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-host\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088560 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrqx\" (UniqueName: \"kubernetes.io/projected/cbe36bec-c099-4625-b8c9-eb67c281b442-kube-api-access-tsrqx\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088569 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b36aafe-8438-4826-93f5-10d39349a4f7-cni-binary-copy\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.089089 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088588 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-ovnkube-config\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088591 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-conf-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088602 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-serviceca\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088631 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-log-socket\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088640 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clr8g\" (UniqueName: \"kubernetes.io/projected/46f624dd-71ff-4136-b6dd-c90053e2799c-kube-api-access-clr8g\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088654 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-hostroot\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088669 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088699 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-registration-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088727 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-tuned\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088755 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fll5\" (UniqueName: \"kubernetes.io/projected/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-kube-api-access-2fll5\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.087906 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysctl-d\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088779 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-ovnkube-script-lib\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088816 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-system-cni-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-cni-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088855 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-kubelet\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088870 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-cni-multus\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088895 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-daemon-config\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088898 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-systemd-units\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.089842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088907 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-run-netns\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088924 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-lib-modules\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088955 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-os-release\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088983 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089012 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e96850e-1476-4be5-9535-99fe12d6740c-host-slash\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089044 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wngg2\" (UniqueName: \"kubernetes.io/projected/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-kube-api-access-wngg2\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089078 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089071 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysconfig\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089140 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-etc-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089145 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysconfig\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089174 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-os-release\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089191 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089218 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e96850e-1476-4be5-9535-99fe12d6740c-host-slash\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089251 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-netns\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089288 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-system-cni-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089297 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-system-cni-dir\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089307 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-ovnkube-config\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.090635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089325 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cnibin\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089355 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-hosts-file\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089362 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-cni-multus\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089354 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-cni-dir\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089388 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-systemd\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089406 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-etc-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089417 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-daemon-config\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089451 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-run-netns\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089450 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-system-cni-dir\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089465 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-slash\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089497 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-node-log\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089511 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-systemd\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.088855 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e96850e-1476-4be5-9535-99fe12d6740c-iptables-alerter-script\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089539 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-node-log\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089551 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cnibin\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089558 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-kubelet\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089583 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-slash\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089587 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysctl-conf\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.091388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089611 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089645 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-cni-netd\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.089651 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089638 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-lib-modules\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089665 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-kubelet\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089614 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-cni-netd\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089686 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-openvswitch\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.089772 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:58.589730167 +0000 UTC m=+3.095446821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089815 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-sysctl-conf\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089849 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-cnibin\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089888 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-cni-bin\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089921 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-host-var-lib-cni-bin\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089968 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-etc-kubernetes\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.089973 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-cnibin\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090004 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090012 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-etc-kubernetes\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090035 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090062 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-host\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.092226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090087 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-ovn\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090119 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-socket-dir-parent\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090178 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-host\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvqx\" (UniqueName: \"kubernetes.io/projected/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-kube-api-access-4zvqx\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090207 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-run-ovn\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090239 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3b36aafe-8438-4826-93f5-10d39349a4f7-multus-socket-dir-parent\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " pod="openshift-multus/multus-64s26"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090223 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2df21b98-cdf7-4d04-a8e9-36920fde23bd-konnectivity-ca\") pod \"konnectivity-agent-x9j9g\" (UID: \"2df21b98-cdf7-4d04-a8e9-36920fde23bd\") " pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090295 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-etc-selinux\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090319 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-sys-fs\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090346 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-var-lib-kubelet\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090371 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-cni-bin\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090459 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46f624dd-71ff-4136-b6dd-c90053e2799c-host-cni-bin\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090467 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce296200-b7fe-4c18-8f04-0fcf672d1613-var-lib-kubelet\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090487 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090635 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/46f624dd-71ff-4136-b6dd-c90053e2799c-ovnkube-script-lib\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.090747 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2df21b98-cdf7-4d04-a8e9-36920fde23bd-konnectivity-ca\") pod \"konnectivity-agent-x9j9g\" (UID: \"2df21b98-cdf7-4d04-a8e9-36920fde23bd\") " pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.092317 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce296200-b7fe-4c18-8f04-0fcf672d1613-tmp\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.093117 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.092457 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46f624dd-71ff-4136-b6dd-c90053e2799c-ovn-node-metrics-cert\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:16:58.093906 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.092516 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce296200-b7fe-4c18-8f04-0fcf672d1613-etc-tuned\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8"
Apr 28 19:16:58.093906 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.092877 2565 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2df21b98-cdf7-4d04-a8e9-36920fde23bd-agent-certs\") pod \"konnectivity-agent-x9j9g\" (UID: \"2df21b98-cdf7-4d04-a8e9-36920fde23bd\") " pod="kube-system/konnectivity-agent-x9j9g" Apr 28 19:16:58.104583 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.104557 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7hs\" (UniqueName: \"kubernetes.io/projected/ce296200-b7fe-4c18-8f04-0fcf672d1613-kube-api-access-xm7hs\") pod \"tuned-lsmn8\" (UID: \"ce296200-b7fe-4c18-8f04-0fcf672d1613\") " pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.113342 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.113321 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:58.113454 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.113346 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:58.113454 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.113360 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hrz4r for pod openshift-network-diagnostics/network-check-target-j22v7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:58.113454 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.113423 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r podName:f9d394e5-59b9-48cb-b465-c8476cbd89d1 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:58.613403855 +0000 UTC m=+3.119120506 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hrz4r" (UniqueName: "kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r") pod "network-check-target-j22v7" (UID: "f9d394e5-59b9-48cb-b465-c8476cbd89d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:58.116932 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.116900 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fll5\" (UniqueName: \"kubernetes.io/projected/784cb0c1-1c08-41f8-8c08-e92cdf0c70ce-kube-api-access-2fll5\") pod \"node-ca-ttwth\" (UID: \"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce\") " pod="openshift-image-registry/node-ca-ttwth" Apr 28 19:16:58.117650 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.117627 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrqx\" (UniqueName: \"kubernetes.io/projected/cbe36bec-c099-4625-b8c9-eb67c281b442-kube-api-access-tsrqx\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:16:58.117980 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.117953 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jc2h\" (UniqueName: \"kubernetes.io/projected/2e96850e-1476-4be5-9535-99fe12d6740c-kube-api-access-9jc2h\") pod \"iptables-alerter-hpgl5\" (UID: \"2e96850e-1476-4be5-9535-99fe12d6740c\") " pod="openshift-network-operator/iptables-alerter-hpgl5" Apr 28 19:16:58.119055 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.119018 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kss4\" (UniqueName: \"kubernetes.io/projected/3b36aafe-8438-4826-93f5-10d39349a4f7-kube-api-access-9kss4\") pod \"multus-64s26\" (UID: \"3b36aafe-8438-4826-93f5-10d39349a4f7\") " 
pod="openshift-multus/multus-64s26" Apr 28 19:16:58.119447 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.119422 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clr8g\" (UniqueName: \"kubernetes.io/projected/46f624dd-71ff-4136-b6dd-c90053e2799c-kube-api-access-clr8g\") pod \"ovnkube-node-pqj2n\" (UID: \"46f624dd-71ff-4136-b6dd-c90053e2799c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.119722 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.119698 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvqx\" (UniqueName: \"kubernetes.io/projected/2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6-kube-api-access-4zvqx\") pod \"multus-additional-cni-plugins-ll4ff\" (UID: \"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6\") " pod="openshift-multus/multus-additional-cni-plugins-ll4ff" Apr 28 19:16:58.191317 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191289 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-device-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191516 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191335 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-tmp-dir\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k" Apr 28 19:16:58.191516 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191360 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mch8c\" (UniqueName: \"kubernetes.io/projected/a12c3700-ac6d-481d-8acc-1ed63740cc01-kube-api-access-mch8c\") pod 
\"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191516 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191400 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-registration-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191516 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191420 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-device-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191516 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191431 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wngg2\" (UniqueName: \"kubernetes.io/projected/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-kube-api-access-wngg2\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k" Apr 28 19:16:58.191516 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-hosts-file\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k" Apr 28 19:16:58.191516 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191498 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191526 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-etc-selinux\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191538 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-registration-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191598 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-hosts-file\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191608 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191550 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-sys-fs\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191656 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-tmp-dir\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191680 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-etc-selinux\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191675 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-socket-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191716 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-sys-fs\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.191837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.191834 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a12c3700-ac6d-481d-8acc-1ed63740cc01-socket-dir\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.203538 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.203471 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mch8c\" (UniqueName: \"kubernetes.io/projected/a12c3700-ac6d-481d-8acc-1ed63740cc01-kube-api-access-mch8c\") pod \"aws-ebs-csi-driver-node-cq2pw\" (UID: \"a12c3700-ac6d-481d-8acc-1ed63740cc01\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.203538 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.203473 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wngg2\" (UniqueName: \"kubernetes.io/projected/5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550-kube-api-access-wngg2\") pod \"node-resolver-gg56k\" (UID: \"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550\") " pod="openshift-dns/node-resolver-gg56k" Apr 28 19:16:58.274478 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.274442 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:16:58.279723 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.279699 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-64s26" Apr 28 19:16:58.288705 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.288683 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" Apr 28 19:16:58.294345 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.294328 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" Apr 28 19:16:58.300878 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.300860 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" Apr 28 19:16:58.308405 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.308389 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hpgl5" Apr 28 19:16:58.314915 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.314896 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ttwth" Apr 28 19:16:58.321489 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.321470 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-x9j9g" Apr 28 19:16:58.325974 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.325955 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gg56k" Apr 28 19:16:58.381038 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.381008 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:58.594435 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.594346 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:16:58.594588 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.594516 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:58.594646 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.594590 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:59.594566382 +0000 UTC m=+4.100283031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:58.685022 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.684981 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce296200_b7fe_4c18_8f04_0fcf672d1613.slice/crio-8515b4870647062bc885bdd07af186cc302e2af1876735f363ff92435865ed64 WatchSource:0}: Error finding container 8515b4870647062bc885bdd07af186cc302e2af1876735f363ff92435865ed64: Status 404 returned error can't find the container with id 8515b4870647062bc885bdd07af186cc302e2af1876735f363ff92435865ed64 Apr 28 19:16:58.685938 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.685908 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df21b98_cdf7_4d04_a8e9_36920fde23bd.slice/crio-b77d15ad868c1fdb047381447c107cb2e335470aedc16323f3f6cb72b8adb43f WatchSource:0}: Error finding container b77d15ad868c1fdb047381447c107cb2e335470aedc16323f3f6cb72b8adb43f: Status 404 returned error can't find the container with id b77d15ad868c1fdb047381447c107cb2e335470aedc16323f3f6cb72b8adb43f Apr 28 19:16:58.686646 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.686610 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda12c3700_ac6d_481d_8acc_1ed63740cc01.slice/crio-a3066cae645ae767a78c81d868b4d61a2b0fbbd4aab5599819e9c190e5c1d03c WatchSource:0}: Error finding container a3066cae645ae767a78c81d868b4d61a2b0fbbd4aab5599819e9c190e5c1d03c: Status 404 returned error can't find the container with id a3066cae645ae767a78c81d868b4d61a2b0fbbd4aab5599819e9c190e5c1d03c Apr 28 19:16:58.687681 
ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.687533 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b36aafe_8438_4826_93f5_10d39349a4f7.slice/crio-b51048becc5af6d2e830a91f7c9be030d025f2870e5ead5d536811feae3b2ad3 WatchSource:0}: Error finding container b51048becc5af6d2e830a91f7c9be030d025f2870e5ead5d536811feae3b2ad3: Status 404 returned error can't find the container with id b51048becc5af6d2e830a91f7c9be030d025f2870e5ead5d536811feae3b2ad3 Apr 28 19:16:58.689415 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.689369 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e4607bc_ad98_4ac7_a34e_0cf93fb04ba6.slice/crio-55dfd9dd4faef31568f46d3206895e6de93cad3491626598d82b44df438e7da1 WatchSource:0}: Error finding container 55dfd9dd4faef31568f46d3206895e6de93cad3491626598d82b44df438e7da1: Status 404 returned error can't find the container with id 55dfd9dd4faef31568f46d3206895e6de93cad3491626598d82b44df438e7da1 Apr 28 19:16:58.692657 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.692618 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f624dd_71ff_4136_b6dd_c90053e2799c.slice/crio-ba013340da7b2e9fdcadf88461228a0a48507be87e05c779befbdcacfc1c052b WatchSource:0}: Error finding container ba013340da7b2e9fdcadf88461228a0a48507be87e05c779befbdcacfc1c052b: Status 404 returned error can't find the container with id ba013340da7b2e9fdcadf88461228a0a48507be87e05c779befbdcacfc1c052b Apr 28 19:16:58.693539 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.693517 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784cb0c1_1c08_41f8_8c08_e92cdf0c70ce.slice/crio-bbed4d2cb9c1cb4f45135492f255b0d185cca979c1667b6672dbb318ac90fcd2 WatchSource:0}: Error 
finding container bbed4d2cb9c1cb4f45135492f255b0d185cca979c1667b6672dbb318ac90fcd2: Status 404 returned error can't find the container with id bbed4d2cb9c1cb4f45135492f255b0d185cca979c1667b6672dbb318ac90fcd2 Apr 28 19:16:58.694709 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.694690 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e96850e_1476_4be5_9535_99fe12d6740c.slice/crio-f5097a45902cbe457d45543e37629f06a8746698929cc60350c1403ef0c5a113 WatchSource:0}: Error finding container f5097a45902cbe457d45543e37629f06a8746698929cc60350c1403ef0c5a113: Status 404 returned error can't find the container with id f5097a45902cbe457d45543e37629f06a8746698929cc60350c1403ef0c5a113 Apr 28 19:16:58.694915 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:58.694874 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:16:58.695069 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.695055 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:58.695121 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.695073 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:58.695121 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.695087 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hrz4r for pod openshift-network-diagnostics/network-check-target-j22v7: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:58.695250 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:58.695136 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r podName:f9d394e5-59b9-48cb-b465-c8476cbd89d1 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:59.695116451 +0000 UTC m=+4.200833110 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hrz4r" (UniqueName: "kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r") pod "network-check-target-j22v7" (UID: "f9d394e5-59b9-48cb-b465-c8476cbd89d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:58.695494 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:16:58.695470 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b0e3ab7_8ea3_4aea_8300_4e7ecf70c550.slice/crio-9ea56a670468a6ceb72918902fa0d4d67bf4e49b248d6d291f603165056572cb WatchSource:0}: Error finding container 9ea56a670468a6ceb72918902fa0d4d67bf4e49b248d6d291f603165056572cb: Status 404 returned error can't find the container with id 9ea56a670468a6ceb72918902fa0d4d67bf4e49b248d6d291f603165056572cb Apr 28 19:16:59.028598 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.028241 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:57 +0000 UTC" deadline="2028-02-11 19:49:29.465884088 +0000 UTC" Apr 28 19:16:59.028598 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.028502 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" 
sleep="15696h32m30.437388857s" Apr 28 19:16:59.120534 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.120496 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal" event={"ID":"ad9143c56694e3dad0c99502369b2b4a","Type":"ContainerStarted","Data":"fccf7c93bade2da8054b1c36a6be355c807e9afb03188b0a0d25470d5e142f2d"} Apr 28 19:16:59.126147 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.126117 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gg56k" event={"ID":"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550","Type":"ContainerStarted","Data":"9ea56a670468a6ceb72918902fa0d4d67bf4e49b248d6d291f603165056572cb"} Apr 28 19:16:59.128070 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.128019 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hpgl5" event={"ID":"2e96850e-1476-4be5-9535-99fe12d6740c","Type":"ContainerStarted","Data":"f5097a45902cbe457d45543e37629f06a8746698929cc60350c1403ef0c5a113"} Apr 28 19:16:59.135260 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.135221 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ttwth" event={"ID":"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce","Type":"ContainerStarted","Data":"bbed4d2cb9c1cb4f45135492f255b0d185cca979c1667b6672dbb318ac90fcd2"} Apr 28 19:16:59.135972 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.135919 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-41.ec2.internal" podStartSLOduration=2.135905638 podStartE2EDuration="2.135905638s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:59.135501797 +0000 UTC m=+3.641218463" watchObservedRunningTime="2026-04-28 19:16:59.135905638 +0000 UTC m=+3.641622307" Apr 28 
19:16:59.142888 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.142865 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"ba013340da7b2e9fdcadf88461228a0a48507be87e05c779befbdcacfc1c052b"} Apr 28 19:16:59.144448 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.144425 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" event={"ID":"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6","Type":"ContainerStarted","Data":"55dfd9dd4faef31568f46d3206895e6de93cad3491626598d82b44df438e7da1"} Apr 28 19:16:59.158645 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.158619 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-64s26" event={"ID":"3b36aafe-8438-4826-93f5-10d39349a4f7","Type":"ContainerStarted","Data":"b51048becc5af6d2e830a91f7c9be030d025f2870e5ead5d536811feae3b2ad3"} Apr 28 19:16:59.165073 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.160892 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" event={"ID":"a12c3700-ac6d-481d-8acc-1ed63740cc01","Type":"ContainerStarted","Data":"a3066cae645ae767a78c81d868b4d61a2b0fbbd4aab5599819e9c190e5c1d03c"} Apr 28 19:16:59.165073 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.163922 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x9j9g" event={"ID":"2df21b98-cdf7-4d04-a8e9-36920fde23bd","Type":"ContainerStarted","Data":"b77d15ad868c1fdb047381447c107cb2e335470aedc16323f3f6cb72b8adb43f"} Apr 28 19:16:59.166235 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.165420 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" 
event={"ID":"ce296200-b7fe-4c18-8f04-0fcf672d1613","Type":"ContainerStarted","Data":"8515b4870647062bc885bdd07af186cc302e2af1876735f363ff92435865ed64"} Apr 28 19:16:59.605439 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.605043 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:16:59.605439 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:59.605259 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:59.605439 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:59.605372 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.6053522 +0000 UTC m=+6.111068850 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:59.706350 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:16:59.706315 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:16:59.706532 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:59.706471 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:59.706532 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:59.706489 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:59.706532 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:59.706502 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hrz4r for pod openshift-network-diagnostics/network-check-target-j22v7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:59.706683 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:16:59.706568 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r podName:f9d394e5-59b9-48cb-b465-c8476cbd89d1 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:17:01.706549264 +0000 UTC m=+6.212265922 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hrz4r" (UniqueName: "kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r") pod "network-check-target-j22v7" (UID: "f9d394e5-59b9-48cb-b465-c8476cbd89d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:17:00.109994 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:00.108990 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:00.109994 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:00.109114 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:00.109994 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:00.109562 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:00.109994 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:00.109668 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:00.179388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:00.179307 2565 generic.go:358] "Generic (PLEG): container finished" podID="17b813db0ef7dca0e7ffcd43ed6816a6" containerID="6b8befa5654950fca9051c6a67bb7ca3d8dc6484872e178fc1a41fdde0b861ee" exitCode=0 Apr 28 19:17:00.180275 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:00.180243 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" event={"ID":"17b813db0ef7dca0e7ffcd43ed6816a6","Type":"ContainerDied","Data":"6b8befa5654950fca9051c6a67bb7ca3d8dc6484872e178fc1a41fdde0b861ee"} Apr 28 19:17:01.191112 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:01.191079 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" event={"ID":"17b813db0ef7dca0e7ffcd43ed6816a6","Type":"ContainerStarted","Data":"b63a3d82fbce946c9e003278a86256c86711e0490818736709b9a26fefe8811a"} Apr 28 19:17:01.623295 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:01.623202 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:01.623460 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:01.623389 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:17:01.623518 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:01.623472 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs 
podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:05.623452157 +0000 UTC m=+10.129168818 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:17:01.723900 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:01.723863 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:01.724073 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:01.724044 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:17:01.724073 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:01.724064 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:17:01.724199 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:01.724078 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hrz4r for pod openshift-network-diagnostics/network-check-target-j22v7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:17:01.724199 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:01.724143 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r podName:f9d394e5-59b9-48cb-b465-c8476cbd89d1 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:05.724123314 +0000 UTC m=+10.229839973 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hrz4r" (UniqueName: "kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r") pod "network-check-target-j22v7" (UID: "f9d394e5-59b9-48cb-b465-c8476cbd89d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:17:02.106152 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:02.106065 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:02.106322 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:02.106227 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:02.106421 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:02.106398 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:02.106522 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:02.106501 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:04.106430 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:04.106392 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:04.106829 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:04.106393 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:04.106829 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:04.106533 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:04.106829 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:04.106664 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:05.655491 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:05.655448 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:05.655943 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:05.655596 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:17:05.655943 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:05.655670 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:13.655649452 +0000 UTC m=+18.161366102 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:17:05.756369 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:05.756333 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:05.756579 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:05.756557 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:17:05.756654 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:05.756583 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:17:05.756654 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:05.756600 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hrz4r for pod openshift-network-diagnostics/network-check-target-j22v7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:17:05.756757 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:05.756670 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r podName:f9d394e5-59b9-48cb-b465-c8476cbd89d1 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:17:13.756650535 +0000 UTC m=+18.262367183 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hrz4r" (UniqueName: "kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r") pod "network-check-target-j22v7" (UID: "f9d394e5-59b9-48cb-b465-c8476cbd89d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:17:06.107776 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:06.107541 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:06.107776 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:06.107650 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:06.107776 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:06.107687 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:06.108057 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:06.107801 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:08.106335 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:08.106264 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:08.106783 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:08.106272 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:08.106783 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:08.106413 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:08.106783 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:08.106479 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:10.106573 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:10.106538 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:10.107011 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:10.106651 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:10.107011 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:10.106710 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:10.107011 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:10.106807 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:12.106397 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:12.106364 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:12.106845 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:12.106515 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:12.106845 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:12.106555 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:12.106845 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:12.106663 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:13.716287 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:13.716242 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:13.716707 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:13.716413 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:17:13.716707 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:13.716495 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:29.716472588 +0000 UTC m=+34.222189233 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:17:13.817226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:13.817179 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:13.817406 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:13.817348 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:17:13.817406 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:13.817370 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:17:13.817406 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:13.817381 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hrz4r for pod openshift-network-diagnostics/network-check-target-j22v7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:17:13.817540 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:13.817436 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r podName:f9d394e5-59b9-48cb-b465-c8476cbd89d1 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:17:29.817419249 +0000 UTC m=+34.323135910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hrz4r" (UniqueName: "kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r") pod "network-check-target-j22v7" (UID: "f9d394e5-59b9-48cb-b465-c8476cbd89d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:17:14.105909 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:14.105821 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:14.106071 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:14.105945 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:14.106071 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:14.106010 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:14.106221 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:14.106132 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:16.106910 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:16.106882 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:16.107255 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:16.106980 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442" Apr 28 19:17:16.107255 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:16.107070 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:16.107255 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:16.107205 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1" Apr 28 19:17:17.218045 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.217840 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ttwth" event={"ID":"784cb0c1-1c08-41f8-8c08-e92cdf0c70ce","Type":"ContainerStarted","Data":"8a219d1c0ec15452a0ceea1b0911d9769e7ea19cbea1945ab601ce7d98432ce6"} Apr 28 19:17:17.220174 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.220133 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"39f273da394a228daea61de52d12ffced0b0372cd7d7d5d71a7c5098f3b4bcd2"} Apr 28 19:17:17.220259 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.220187 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"f6e9dd13453fe28a05eb5228e36d2f154833d17a0f74b23944f866a9e6c539b3"} Apr 28 19:17:17.220259 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.220204 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"aad0662cc14949ac091d169b4cc368be6821a2e9a36c4b1d433895e63c6de20f"} Apr 28 19:17:17.220259 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.220216 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"1c1c10d681a83f9f7181a9be0f65fccd61e46acd7cd9821a95a8d65082d0c77e"} Apr 28 19:17:17.221454 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.221424 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6" 
containerID="45a49efb3c0531094c8683c7ae296f862e07c231266022e8cf1cafe21d0c0d22" exitCode=0 Apr 28 19:17:17.221536 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.221511 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" event={"ID":"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6","Type":"ContainerDied","Data":"45a49efb3c0531094c8683c7ae296f862e07c231266022e8cf1cafe21d0c0d22"} Apr 28 19:17:17.225639 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.225599 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-64s26" event={"ID":"3b36aafe-8438-4826-93f5-10d39349a4f7","Type":"ContainerStarted","Data":"d6bb2dc99b22abb57033812543cb17e594a34437e713077696ce5ac6ec48df32"} Apr 28 19:17:17.226987 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.226945 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" event={"ID":"a12c3700-ac6d-481d-8acc-1ed63740cc01","Type":"ContainerStarted","Data":"2a645d04ca5f42d9db28ab0c79270d5a99de65b1235a14df72d99450741f3251"} Apr 28 19:17:17.228376 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.228349 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x9j9g" event={"ID":"2df21b98-cdf7-4d04-a8e9-36920fde23bd","Type":"ContainerStarted","Data":"a7b1f95c113f85048a58717d4bfae24e5b5cbe2d4f1eed759fedcc1181d9f5cf"} Apr 28 19:17:17.229619 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.229598 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" event={"ID":"ce296200-b7fe-4c18-8f04-0fcf672d1613","Type":"ContainerStarted","Data":"7bc1427aeef53d4ba0c723339975d8da7b730d04e0750952b060ef2f51ee2e05"} Apr 28 19:17:17.230716 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.230695 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gg56k" 
event={"ID":"5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550","Type":"ContainerStarted","Data":"5391699d7adfecf1cf7c9701054c1493dff545d390785a7d07f82782e05149ad"}
Apr 28 19:17:17.237263 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.237231 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-41.ec2.internal" podStartSLOduration=20.23722044 podStartE2EDuration="20.23722044s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:17:01.226340565 +0000 UTC m=+5.732057232" watchObservedRunningTime="2026-04-28 19:17:17.23722044 +0000 UTC m=+21.742937109"
Apr 28 19:17:17.272979 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.272935 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ttwth" podStartSLOduration=3.626767623 podStartE2EDuration="21.272917799s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.695692223 +0000 UTC m=+3.201408867" lastFinishedPulling="2026-04-28 19:17:16.341842384 +0000 UTC m=+20.847559043" observedRunningTime="2026-04-28 19:17:17.237760489 +0000 UTC m=+21.743477155" watchObservedRunningTime="2026-04-28 19:17:17.272917799 +0000 UTC m=+21.778634468"
Apr 28 19:17:17.323821 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.323767 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gg56k" podStartSLOduration=3.682558659 podStartE2EDuration="21.323747666s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.700304181 +0000 UTC m=+3.206020838" lastFinishedPulling="2026-04-28 19:17:16.341493187 +0000 UTC m=+20.847209845" observedRunningTime="2026-04-28 19:17:17.296188772 +0000 UTC m=+21.801905435" watchObservedRunningTime="2026-04-28 19:17:17.323747666 +0000 UTC m=+21.829464333"
Apr 28 19:17:17.349232 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.349153 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lsmn8" podStartSLOduration=3.692258259 podStartE2EDuration="21.349135529s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.687537572 +0000 UTC m=+3.193254234" lastFinishedPulling="2026-04-28 19:17:16.344414844 +0000 UTC m=+20.850131504" observedRunningTime="2026-04-28 19:17:17.34881842 +0000 UTC m=+21.854535121" watchObservedRunningTime="2026-04-28 19:17:17.349135529 +0000 UTC m=+21.854852196"
Apr 28 19:17:17.349948 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.349915 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-64s26" podStartSLOduration=3.69595663 podStartE2EDuration="21.349903337s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.689470643 +0000 UTC m=+3.195187287" lastFinishedPulling="2026-04-28 19:17:16.343417333 +0000 UTC m=+20.849133994" observedRunningTime="2026-04-28 19:17:17.324222677 +0000 UTC m=+21.829939344" watchObservedRunningTime="2026-04-28 19:17:17.349903337 +0000 UTC m=+21.855620005"
Apr 28 19:17:17.366366 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.366276 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-x9j9g" podStartSLOduration=12.518731056 podStartE2EDuration="21.366260717s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.687926218 +0000 UTC m=+3.193642863" lastFinishedPulling="2026-04-28 19:17:07.53545588 +0000 UTC m=+12.041172524" observedRunningTime="2026-04-28 19:17:17.365253581 +0000 UTC m=+21.870970258" watchObservedRunningTime="2026-04-28 19:17:17.366260717 +0000 UTC m=+21.871977385"
Apr 28 19:17:17.534382 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:17.534359 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 28 19:17:18.045726 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.045614 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:17:17.534377942Z","UUID":"d385eae8-cd11-4a6b-a60c-cac288b51fc0","Handler":null,"Name":"","Endpoint":""}
Apr 28 19:17:18.047386 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.047366 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 28 19:17:18.047386 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.047396 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 28 19:17:18.106238 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.106208 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:17:18.106421 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:18.106335 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1"
Apr 28 19:17:18.106421 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.106388 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:17:18.106543 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:18.106508 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:17:18.234194 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.234131 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hpgl5" event={"ID":"2e96850e-1476-4be5-9535-99fe12d6740c","Type":"ContainerStarted","Data":"cfba9d42ac4880757775dd1cea52e7c01b337b4dc622e7c1624381100898cc19"}
Apr 28 19:17:18.237332 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.237300 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"00892da0eb76454e5c6a7dcbb915e31be2699100b0cfd5ed96ad373b6aab8cf8"}
Apr 28 19:17:18.237463 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.237335 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"eea0c808b59be0f55cae52b72d2e29f9dc5a15dfb1292638c5bbc3f9a1ef71aa"}
Apr 28 19:17:18.239313 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.239283 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" event={"ID":"a12c3700-ac6d-481d-8acc-1ed63740cc01","Type":"ContainerStarted","Data":"f677d2ba2404b9c5f06d096436969780d2ea278d540a45a8a3e7334d3b020784"}
Apr 28 19:17:18.251986 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:18.251944 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hpgl5" podStartSLOduration=4.610044336 podStartE2EDuration="22.251930471s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.699824825 +0000 UTC m=+3.205541468" lastFinishedPulling="2026-04-28 19:17:16.341710952 +0000 UTC m=+20.847427603" observedRunningTime="2026-04-28 19:17:18.251489838 +0000 UTC m=+22.757206504" watchObservedRunningTime="2026-04-28 19:17:18.251930471 +0000 UTC m=+22.757647137"
Apr 28 19:17:19.243227 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:19.243189 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" event={"ID":"a12c3700-ac6d-481d-8acc-1ed63740cc01","Type":"ContainerStarted","Data":"751f1825bcd8d00a85e6ba1a0758e021e5a97d7a8d393fd8b887caabb4241ee6"}
Apr 28 19:17:19.272363 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:19.272309 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cq2pw" podStartSLOduration=3.585993302 podStartE2EDuration="23.272293646s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.688587361 +0000 UTC m=+3.194304008" lastFinishedPulling="2026-04-28 19:17:18.374887702 +0000 UTC m=+22.880604352" observedRunningTime="2026-04-28 19:17:19.270456371 +0000 UTC m=+23.776173051" watchObservedRunningTime="2026-04-28 19:17:19.272293646 +0000 UTC m=+23.778010309"
Apr 28 19:17:19.312831 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:19.312806 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:17:19.438772 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:19.438741 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:17:19.439399 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:19.439346 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:17:20.106365 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:20.106337 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:17:20.106544 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:20.106464 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:17:20.106544 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:20.106522 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:17:20.106729 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:20.106630 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1"
Apr 28 19:17:20.245690 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:20.245641 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-x9j9g"
Apr 28 19:17:21.250242 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:21.250030 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"fbbd631671e285e8b77f10891f5f4e7452503b790ef44fc9d38def8e8d9935ac"}
Apr 28 19:17:22.105963 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:22.105929 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:17:22.106130 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:22.105937 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:17:22.106130 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:22.106036 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:17:22.106130 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:22.106111 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1"
Apr 28 19:17:22.253693 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:22.253659 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6" containerID="e8c997e57b23577daea7a915894c9a815c8affd31dcf00000c0109cb143df71a" exitCode=0
Apr 28 19:17:22.254269 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:22.253729 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" event={"ID":"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6","Type":"ContainerDied","Data":"e8c997e57b23577daea7a915894c9a815c8affd31dcf00000c0109cb143df71a"}
Apr 28 19:17:23.259496 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:23.259281 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" event={"ID":"46f624dd-71ff-4136-b6dd-c90053e2799c","Type":"ContainerStarted","Data":"178cbced64e82e01b49066df05090c0764496a87e0e11781b134359b5a0fcddb"}
Apr 28 19:17:23.260135 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:23.259643 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:17:23.260135 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:23.259693 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:17:23.261419 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:23.261392 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6" containerID="220ae31f03e6b8111c95b5c2452a4c9a39b4f89751824a7d7d52d6901c64dba1" exitCode=0
Apr 28 19:17:23.261525 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:23.261446 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" event={"ID":"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6","Type":"ContainerDied","Data":"220ae31f03e6b8111c95b5c2452a4c9a39b4f89751824a7d7d52d6901c64dba1"}
Apr 28 19:17:23.274482 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:23.274460 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:17:23.297347 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:23.297300 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" podStartSLOduration=9.136640498 podStartE2EDuration="27.29728875s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.694718238 +0000 UTC m=+3.200434883" lastFinishedPulling="2026-04-28 19:17:16.855366491 +0000 UTC m=+21.361083135" observedRunningTime="2026-04-28 19:17:23.294824713 +0000 UTC m=+27.800541379" watchObservedRunningTime="2026-04-28 19:17:23.29728875 +0000 UTC m=+27.803005416"
Apr 28 19:17:24.105735 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:24.105657 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:17:24.105854 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:24.105760 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:17:24.105854 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:24.105841 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:17:24.105938 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:24.105921 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1"
Apr 28 19:17:24.265222 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:24.265187 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6" containerID="bff13a5a88d628a4da0c2131711135f455e9b37d9b5ccaeadd6ff1bfefdaaf02" exitCode=0
Apr 28 19:17:24.265580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:24.265275 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" event={"ID":"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6","Type":"ContainerDied","Data":"bff13a5a88d628a4da0c2131711135f455e9b37d9b5ccaeadd6ff1bfefdaaf02"}
Apr 28 19:17:24.268144 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:24.267461 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:17:24.284661 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:24.284635 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n"
Apr 28 19:17:25.323323 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:25.323134 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j22v7"]
Apr 28 19:17:25.323323 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:25.323298 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:17:25.323861 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:25.323404 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1"
Apr 28 19:17:25.325946 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:25.325674 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hndjc"]
Apr 28 19:17:25.325946 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:25.325803 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:17:25.325946 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:25.325908 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:17:27.106574 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:27.106542 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:17:27.106957 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:27.106542 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:17:27.106957 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:27.106665 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1"
Apr 28 19:17:27.106957 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:27.106752 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:17:29.106330 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.106094 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:17:29.106771 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.106094 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:17:29.106771 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.106424 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j22v7" podUID="f9d394e5-59b9-48cb-b465-c8476cbd89d1"
Apr 28 19:17:29.106771 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.106578 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:17:29.364467 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.364378 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-41.ec2.internal" event="NodeReady"
Apr 28 19:17:29.364614 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.364518 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 28 19:17:29.424645 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.424609 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2jwqs"]
Apr 28 19:17:29.445983 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.445952 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w7t9f"]
Apr 28 19:17:29.446182 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.446148 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.449148 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.449125 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jtx2j\""
Apr 28 19:17:29.449291 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.449195 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 28 19:17:29.449291 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.449222 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 28 19:17:29.463201 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.463180 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w7t9f"]
Apr 28 19:17:29.463201 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.463207 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2jwqs"]
Apr 28 19:17:29.463378 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.463283 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:17:29.466177 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.466114 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 28 19:17:29.466177 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.466156 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-524nx\""
Apr 28 19:17:29.466365 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.466344 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 28 19:17:29.466728 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.466706 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 28 19:17:29.540847 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.540798 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9fjq\" (UniqueName: \"kubernetes.io/projected/4d5f310c-a755-4af1-8570-335ac92bb8cf-kube-api-access-z9fjq\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.541015 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.540875 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d5f310c-a755-4af1-8570-335ac92bb8cf-tmp-dir\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.541015 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.540906 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d5f310c-a755-4af1-8570-335ac92bb8cf-config-volume\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.541015 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.540937 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.642279 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.642195 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.642468 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.642282 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:17:29.642468 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.642315 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9fjq\" (UniqueName: \"kubernetes.io/projected/4d5f310c-a755-4af1-8570-335ac92bb8cf-kube-api-access-z9fjq\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.642468 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.642355 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:29.642468 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.642434 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls podName:4d5f310c-a755-4af1-8570-335ac92bb8cf nodeName:}" failed. No retries permitted until 2026-04-28 19:17:30.14241254 +0000 UTC m=+34.648129202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls") pod "dns-default-2jwqs" (UID: "4d5f310c-a755-4af1-8570-335ac92bb8cf") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:29.642468 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.642364 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d5f310c-a755-4af1-8570-335ac92bb8cf-tmp-dir\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.642752 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.642725 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqlxd\" (UniqueName: \"kubernetes.io/projected/4f0dd845-b66a-4d78-b7a3-811ca24028e4-kube-api-access-pqlxd\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:17:29.642752 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.642742 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4d5f310c-a755-4af1-8570-335ac92bb8cf-tmp-dir\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.642927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.642771 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d5f310c-a755-4af1-8570-335ac92bb8cf-config-volume\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.643349 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.643327 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d5f310c-a755-4af1-8570-335ac92bb8cf-config-volume\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.657099 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.657073 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9fjq\" (UniqueName: \"kubernetes.io/projected/4d5f310c-a755-4af1-8570-335ac92bb8cf-kube-api-access-z9fjq\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:17:29.744036 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.744010 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqlxd\" (UniqueName: \"kubernetes.io/projected/4f0dd845-b66a-4d78-b7a3-811ca24028e4-kube-api-access-pqlxd\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:17:29.744155 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.744067 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:17:29.744222 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.744173 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:17:29.744264 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.744242 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:17:29.744303 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.744261 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:29.744350 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.744315 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:01.744295795 +0000 UTC m=+66.250012451 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:17:29.744350 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.744334 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert podName:4f0dd845-b66a-4d78-b7a3-811ca24028e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:30.244325569 +0000 UTC m=+34.750042223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert") pod "ingress-canary-w7t9f" (UID: "4f0dd845-b66a-4d78-b7a3-811ca24028e4") : secret "canary-serving-cert" not found
Apr 28 19:17:29.752807 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.752784 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqlxd\" (UniqueName: \"kubernetes.io/projected/4f0dd845-b66a-4d78-b7a3-811ca24028e4-kube-api-access-pqlxd\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:17:29.845064 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:29.845028 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7"
Apr 28 19:17:29.845226 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.845199 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:17:29.845226 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.845219 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:17:29.845311 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.845230 2565 projected.go:194] Error preparing data for projected volume kube-api-access-hrz4r for pod openshift-network-diagnostics/network-check-target-j22v7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr
28 19:17:29.845311 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:29.845281 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r podName:f9d394e5-59b9-48cb-b465-c8476cbd89d1 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:01.845266991 +0000 UTC m=+66.350983634 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hrz4r" (UniqueName: "kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r") pod "network-check-target-j22v7" (UID: "f9d394e5-59b9-48cb-b465-c8476cbd89d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:17:30.147445 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:30.147423 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs" Apr 28 19:17:30.147846 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:30.147533 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:30.147846 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:30.147598 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls podName:4d5f310c-a755-4af1-8570-335ac92bb8cf nodeName:}" failed. No retries permitted until 2026-04-28 19:17:31.147584072 +0000 UTC m=+35.653300729 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls") pod "dns-default-2jwqs" (UID: "4d5f310c-a755-4af1-8570-335ac92bb8cf") : secret "dns-default-metrics-tls" not found Apr 28 19:17:30.248052 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:30.248024 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f" Apr 28 19:17:30.248195 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:30.248127 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:30.248195 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:30.248192 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert podName:4f0dd845-b66a-4d78-b7a3-811ca24028e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:31.248177636 +0000 UTC m=+35.753894301 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert") pod "ingress-canary-w7t9f" (UID: "4f0dd845-b66a-4d78-b7a3-811ca24028e4") : secret "canary-serving-cert" not found Apr 28 19:17:31.105854 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.105822 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:17:31.106022 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.105822 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:17:31.110480 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.110455 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cnh2p\"" Apr 28 19:17:31.110480 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.110471 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9tnjq\"" Apr 28 19:17:31.110671 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.110497 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:17:31.110671 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.110456 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:17:31.110671 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.110588 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:17:31.155044 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.155021 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs" Apr 28 19:17:31.155525 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:31.155123 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:31.155525 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:31.155195 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls 
podName:4d5f310c-a755-4af1-8570-335ac92bb8cf nodeName:}" failed. No retries permitted until 2026-04-28 19:17:33.155182246 +0000 UTC m=+37.660898890 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls") pod "dns-default-2jwqs" (UID: "4d5f310c-a755-4af1-8570-335ac92bb8cf") : secret "dns-default-metrics-tls" not found Apr 28 19:17:31.256428 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.256385 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f" Apr 28 19:17:31.256585 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:31.256513 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:31.256585 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:31.256576 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert podName:4f0dd845-b66a-4d78-b7a3-811ca24028e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:33.256563366 +0000 UTC m=+37.762280011 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert") pod "ingress-canary-w7t9f" (UID: "4f0dd845-b66a-4d78-b7a3-811ca24028e4") : secret "canary-serving-cert" not found Apr 28 19:17:31.280855 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.280821 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6" containerID="2354e0fd079d751aa5e5bcd01862feab858c4de3d32112cf7e7f48578521c5e6" exitCode=0 Apr 28 19:17:31.280971 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:31.280890 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" event={"ID":"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6","Type":"ContainerDied","Data":"2354e0fd079d751aa5e5bcd01862feab858c4de3d32112cf7e7f48578521c5e6"} Apr 28 19:17:32.285390 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:32.285348 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6" containerID="32c6362141bef2509d0b47843a43e7c0189e117576f0604135f13ba75595968c" exitCode=0 Apr 28 19:17:32.285799 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:32.285419 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" event={"ID":"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6","Type":"ContainerDied","Data":"32c6362141bef2509d0b47843a43e7c0189e117576f0604135f13ba75595968c"} Apr 28 19:17:33.170890 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:33.170846 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs" Apr 28 19:17:33.171060 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:33.170993 2565 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:33.171060 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:33.171054 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls podName:4d5f310c-a755-4af1-8570-335ac92bb8cf nodeName:}" failed. No retries permitted until 2026-04-28 19:17:37.171037039 +0000 UTC m=+41.676753706 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls") pod "dns-default-2jwqs" (UID: "4d5f310c-a755-4af1-8570-335ac92bb8cf") : secret "dns-default-metrics-tls" not found Apr 28 19:17:33.271794 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:33.271758 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f" Apr 28 19:17:33.271935 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:33.271916 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:33.271988 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:33.271979 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert podName:4f0dd845-b66a-4d78-b7a3-811ca24028e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:37.271964362 +0000 UTC m=+41.777681011 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert") pod "ingress-canary-w7t9f" (UID: "4f0dd845-b66a-4d78-b7a3-811ca24028e4") : secret "canary-serving-cert" not found Apr 28 19:17:33.289481 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:33.289453 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" event={"ID":"2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6","Type":"ContainerStarted","Data":"97b5b37a33b2e66959d541aec5d6fc9348882ef8f2c692871edf22e76d3dc049"} Apr 28 19:17:33.318676 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:33.318635 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ll4ff" podStartSLOduration=5.860976207 podStartE2EDuration="37.318620674s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.691433852 +0000 UTC m=+3.197150496" lastFinishedPulling="2026-04-28 19:17:30.14907832 +0000 UTC m=+34.654794963" observedRunningTime="2026-04-28 19:17:33.317614714 +0000 UTC m=+37.823331381" watchObservedRunningTime="2026-04-28 19:17:33.318620674 +0000 UTC m=+37.824337339" Apr 28 19:17:37.200748 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:37.200706 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs" Apr 28 19:17:37.201121 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:37.200822 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:37.201121 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:37.200874 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls podName:4d5f310c-a755-4af1-8570-335ac92bb8cf nodeName:}" failed. No retries permitted until 2026-04-28 19:17:45.200860158 +0000 UTC m=+49.706576802 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls") pod "dns-default-2jwqs" (UID: "4d5f310c-a755-4af1-8570-335ac92bb8cf") : secret "dns-default-metrics-tls" not found Apr 28 19:17:37.301606 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:37.301579 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f" Apr 28 19:17:37.301763 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:37.301717 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:37.301803 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:37.301786 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert podName:4f0dd845-b66a-4d78-b7a3-811ca24028e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:45.30177087 +0000 UTC m=+49.807487515 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert") pod "ingress-canary-w7t9f" (UID: "4f0dd845-b66a-4d78-b7a3-811ca24028e4") : secret "canary-serving-cert" not found Apr 28 19:17:45.255062 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:45.255020 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs" Apr 28 19:17:45.255466 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:45.255190 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:45.255466 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:45.255255 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls podName:4d5f310c-a755-4af1-8570-335ac92bb8cf nodeName:}" failed. No retries permitted until 2026-04-28 19:18:01.255238273 +0000 UTC m=+65.760954921 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls") pod "dns-default-2jwqs" (UID: "4d5f310c-a755-4af1-8570-335ac92bb8cf") : secret "dns-default-metrics-tls" not found Apr 28 19:17:45.355428 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:45.355396 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f" Apr 28 19:17:45.355575 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:45.355504 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:45.355575 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:17:45.355557 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert podName:4f0dd845-b66a-4d78-b7a3-811ca24028e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:01.35554144 +0000 UTC m=+65.861258101 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert") pod "ingress-canary-w7t9f" (UID: "4f0dd845-b66a-4d78-b7a3-811ca24028e4") : secret "canary-serving-cert" not found Apr 28 19:17:56.279043 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:17:56.279015 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqj2n" Apr 28 19:18:01.258176 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:01.258128 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs" Apr 28 19:18:01.258575 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:01.258277 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:18:01.258575 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:01.258344 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls podName:4d5f310c-a755-4af1-8570-335ac92bb8cf nodeName:}" failed. No retries permitted until 2026-04-28 19:18:33.25832898 +0000 UTC m=+97.764045628 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls") pod "dns-default-2jwqs" (UID: "4d5f310c-a755-4af1-8570-335ac92bb8cf") : secret "dns-default-metrics-tls" not found Apr 28 19:18:01.359178 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:01.359127 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f" Apr 28 19:18:01.359337 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:01.359268 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:18:01.359337 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:01.359336 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert podName:4f0dd845-b66a-4d78-b7a3-811ca24028e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:33.359321139 +0000 UTC m=+97.865037783 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert") pod "ingress-canary-w7t9f" (UID: "4f0dd845-b66a-4d78-b7a3-811ca24028e4") : secret "canary-serving-cert" not found Apr 28 19:18:01.762362 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:01.762327 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:18:01.765062 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:01.765045 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:18:01.773218 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:01.773201 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:18:01.773283 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:01.773271 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:05.773253651 +0000 UTC m=+130.278970299 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : secret "metrics-daemon-secret" not found Apr 28 19:18:01.862657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:01.862623 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:18:01.867081 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:01.867059 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:18:01.878003 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:01.877983 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:18:01.886994 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:01.886966 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrz4r\" (UniqueName: \"kubernetes.io/projected/f9d394e5-59b9-48cb-b465-c8476cbd89d1-kube-api-access-hrz4r\") pod \"network-check-target-j22v7\" (UID: \"f9d394e5-59b9-48cb-b465-c8476cbd89d1\") " pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:18:02.022546 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:02.022464 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9tnjq\"" Apr 28 19:18:02.029911 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:02.029889 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:18:02.250411 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:02.250379 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j22v7"] Apr 28 19:18:02.255747 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:18:02.255711 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d394e5_59b9_48cb_b465_c8476cbd89d1.slice/crio-db0a6b73738cdea701490945629dedd4e28ae8667443e559fb6e7149ef084e3f WatchSource:0}: Error finding container db0a6b73738cdea701490945629dedd4e28ae8667443e559fb6e7149ef084e3f: Status 404 returned error can't find the container with id db0a6b73738cdea701490945629dedd4e28ae8667443e559fb6e7149ef084e3f Apr 28 19:18:02.343672 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:02.343593 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j22v7" event={"ID":"f9d394e5-59b9-48cb-b465-c8476cbd89d1","Type":"ContainerStarted","Data":"db0a6b73738cdea701490945629dedd4e28ae8667443e559fb6e7149ef084e3f"} Apr 28 19:18:05.351190 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:05.351140 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j22v7" event={"ID":"f9d394e5-59b9-48cb-b465-c8476cbd89d1","Type":"ContainerStarted","Data":"a53fd6f353e949a171b1e50d5a1be795e8fb5f82c0cfabde6193c048eedaeca0"} Apr 28 19:18:05.351533 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:05.351244 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:18:05.371382 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:05.371339 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-j22v7" 
podStartSLOduration=66.468869785 podStartE2EDuration="1m9.371318352s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:18:02.257569923 +0000 UTC m=+66.763286567" lastFinishedPulling="2026-04-28 19:18:05.160018485 +0000 UTC m=+69.665735134" observedRunningTime="2026-04-28 19:18:05.369917274 +0000 UTC m=+69.875633940" watchObservedRunningTime="2026-04-28 19:18:05.371318352 +0000 UTC m=+69.877035009" Apr 28 19:18:33.279022 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:33.278990 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs" Apr 28 19:18:33.279586 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:33.279154 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:18:33.279586 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:33.279268 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls podName:4d5f310c-a755-4af1-8570-335ac92bb8cf nodeName:}" failed. No retries permitted until 2026-04-28 19:19:37.279245086 +0000 UTC m=+161.784961741 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls") pod "dns-default-2jwqs" (UID: "4d5f310c-a755-4af1-8570-335ac92bb8cf") : secret "dns-default-metrics-tls" not found Apr 28 19:18:33.379740 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:33.379706 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f" Apr 28 19:18:33.379871 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:33.379814 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:18:33.379871 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:18:33.379870 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert podName:4f0dd845-b66a-4d78-b7a3-811ca24028e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:37.379856308 +0000 UTC m=+161.885572952 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert") pod "ingress-canary-w7t9f" (UID: "4f0dd845-b66a-4d78-b7a3-811ca24028e4") : secret "canary-serving-cert" not found Apr 28 19:18:36.355820 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:18:36.355788 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-j22v7" Apr 28 19:19:05.796891 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:05.796844 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:19:05.797417 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:05.796983 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:19:05.797417 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:05.797055 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs podName:cbe36bec-c099-4625-b8c9-eb67c281b442 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:07.797036304 +0000 UTC m=+252.302752966 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs") pod "network-metrics-daemon-hndjc" (UID: "cbe36bec-c099-4625-b8c9-eb67c281b442") : secret "metrics-daemon-secret" not found Apr 28 19:19:11.173080 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.173050 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48"] Apr 28 19:19:11.174913 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.174898 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:11.180275 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.180248 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 28 19:19:11.180419 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.180311 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 28 19:19:11.181421 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.181404 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-slqmc\"" Apr 28 19:19:11.181527 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.181509 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:19:11.197626 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.197601 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48"] Apr 28 19:19:11.267233 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.267208 2565 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv"] Apr 28 19:19:11.269060 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.269043 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv" Apr 28 19:19:11.271607 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.271587 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:19:11.271702 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.271667 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-6jcqr\"" Apr 28 19:19:11.272293 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.272275 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 28 19:19:11.284412 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.284393 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-d64467788-4mcrw"] Apr 28 19:19:11.286076 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.286060 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv"] Apr 28 19:19:11.286187 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.286146 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.289345 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.289325 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 28 19:19:11.289490 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.289374 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 28 19:19:11.289490 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.289398 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 28 19:19:11.289652 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.289632 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 28 19:19:11.289747 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.289637 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sxstb\"" Apr 28 19:19:11.289878 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.289859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 28 19:19:11.293225 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.293205 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 28 19:19:11.304391 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.304368 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d64467788-4mcrw"] Apr 28 19:19:11.340333 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.340303 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrkn\" (UniqueName: 
\"kubernetes.io/projected/59674cfb-c293-4b6c-8e89-c681d39465e1-kube-api-access-fzrkn\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:11.340460 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.340359 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:11.368373 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.368343 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4"] Apr 28 19:19:11.370067 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.370053 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.373037 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.373019 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 28 19:19:11.374692 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.374663 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mphsq\"" Apr 28 19:19:11.374692 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.374686 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 28 19:19:11.374826 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.374703 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 28 19:19:11.374826 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.374689 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 28 19:19:11.387491 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.387467 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4"] Apr 28 19:19:11.441039 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.440959 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:11.441195 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.441045 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-default-certificate\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.441195 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.441075 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-stats-auth\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.441195 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:11.441124 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:19:11.441195 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.441144 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5chm\" (UniqueName: \"kubernetes.io/projected/8d700fe5-5b61-4638-9883-cb767568ec47-kube-api-access-z5chm\") pod \"volume-data-source-validator-7c6cbb6c87-bqqnv\" (UID: \"8d700fe5-5b61-4638-9883-cb767568ec47\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv" Apr 28 19:19:11.441343 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.441216 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrkn\" (UniqueName: \"kubernetes.io/projected/59674cfb-c293-4b6c-8e89-c681d39465e1-kube-api-access-fzrkn\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:11.441343 ip-10-0-141-41 
kubenswrapper[2565]: E0428 19:19:11.441248 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls podName:59674cfb-c293-4b6c-8e89-c681d39465e1 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:11.9412312 +0000 UTC m=+136.446947844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9sq48" (UID: "59674cfb-c293-4b6c-8e89-c681d39465e1") : secret "samples-operator-tls" not found Apr 28 19:19:11.441425 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.441339 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.441425 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.441374 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nqb\" (UniqueName: \"kubernetes.io/projected/977bb0b7-3623-4100-ba3a-1b9d24046162-kube-api-access-48nqb\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.441425 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.441397 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 
19:19:11.454143 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.454121 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrkn\" (UniqueName: \"kubernetes.io/projected/59674cfb-c293-4b6c-8e89-c681d39465e1-kube-api-access-fzrkn\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:11.542413 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542377 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5chm\" (UniqueName: \"kubernetes.io/projected/8d700fe5-5b61-4638-9883-cb767568ec47-kube-api-access-z5chm\") pod \"volume-data-source-validator-7c6cbb6c87-bqqnv\" (UID: \"8d700fe5-5b61-4638-9883-cb767568ec47\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv" Apr 28 19:19:11.542609 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542423 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.542609 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:11.542517 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:19:11.542609 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542535 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48nqb\" (UniqueName: \"kubernetes.io/projected/977bb0b7-3623-4100-ba3a-1b9d24046162-kube-api-access-48nqb\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " 
pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.542609 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:11.542585 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:12.04256452 +0000 UTC m=+136.548281167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : secret "router-metrics-certs-default" not found Apr 28 19:19:11.542609 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542604 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.542847 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542634 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.542847 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542707 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9b167459-93b9-4e7b-bd66-94d693cab19e-telemetry-config\") pod 
\"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.542847 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542752 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbfb6\" (UniqueName: \"kubernetes.io/projected/9b167459-93b9-4e7b-bd66-94d693cab19e-kube-api-access-xbfb6\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.542847 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:11.542758 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:12.04274171 +0000 UTC m=+136.548458373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : configmap references non-existent config key: service-ca.crt Apr 28 19:19:11.542971 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542847 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-default-certificate\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.542971 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.542866 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-stats-auth\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.545301 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.545271 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-default-certificate\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.545414 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.545317 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-stats-auth\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.556998 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:19:11.556974 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5chm\" (UniqueName: \"kubernetes.io/projected/8d700fe5-5b61-4638-9883-cb767568ec47-kube-api-access-z5chm\") pod \"volume-data-source-validator-7c6cbb6c87-bqqnv\" (UID: \"8d700fe5-5b61-4638-9883-cb767568ec47\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv" Apr 28 19:19:11.557098 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.557036 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nqb\" (UniqueName: \"kubernetes.io/projected/977bb0b7-3623-4100-ba3a-1b9d24046162-kube-api-access-48nqb\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:11.577541 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.577517 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv" Apr 28 19:19:11.643605 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.643572 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.643761 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.643638 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9b167459-93b9-4e7b-bd66-94d693cab19e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.643761 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.643666 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbfb6\" (UniqueName: \"kubernetes.io/projected/9b167459-93b9-4e7b-bd66-94d693cab19e-kube-api-access-xbfb6\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.643761 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:11.643747 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:11.643908 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:11.643829 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls podName:9b167459-93b9-4e7b-bd66-94d693cab19e nodeName:}" failed. No retries permitted until 2026-04-28 19:19:12.143808237 +0000 UTC m=+136.649524899 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-mj7v4" (UID: "9b167459-93b9-4e7b-bd66-94d693cab19e") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:11.644418 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.644398 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9b167459-93b9-4e7b-bd66-94d693cab19e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.663566 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.663539 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbfb6\" (UniqueName: \"kubernetes.io/projected/9b167459-93b9-4e7b-bd66-94d693cab19e-kube-api-access-xbfb6\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:11.689107 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.689077 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv"] Apr 28 19:19:11.694911 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:11.694888 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d700fe5_5b61_4638_9883_cb767568ec47.slice/crio-4feb9111bc1d83ac77fc730d250844ed9e5a56bceefb48c16a302f97f7d3b612 WatchSource:0}: Error finding container 4feb9111bc1d83ac77fc730d250844ed9e5a56bceefb48c16a302f97f7d3b612: Status 404 returned error can't find the container with id 
4feb9111bc1d83ac77fc730d250844ed9e5a56bceefb48c16a302f97f7d3b612 Apr 28 19:19:11.947314 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:11.947235 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:11.947444 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:11.947374 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:19:11.947444 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:11.947436 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls podName:59674cfb-c293-4b6c-8e89-c681d39465e1 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:12.947420681 +0000 UTC m=+137.453137330 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9sq48" (UID: "59674cfb-c293-4b6c-8e89-c681d39465e1") : secret "samples-operator-tls" not found Apr 28 19:19:12.047621 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:12.047573 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:12.047621 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:12.047627 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:12.047817 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:12.047725 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:19:12.047817 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:12.047770 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:13.04775123 +0000 UTC m=+137.553467893 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : configmap references non-existent config key: service-ca.crt Apr 28 19:19:12.047817 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:12.047786 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:13.047779065 +0000 UTC m=+137.553495709 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : secret "router-metrics-certs-default" not found Apr 28 19:19:12.148362 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:12.148318 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:12.148535 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:12.148507 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:12.148615 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:12.148602 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls podName:9b167459-93b9-4e7b-bd66-94d693cab19e nodeName:}" failed. 
No retries permitted until 2026-04-28 19:19:13.148580828 +0000 UTC m=+137.654297475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-mj7v4" (UID: "9b167459-93b9-4e7b-bd66-94d693cab19e") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:12.470110 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:12.470075 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv" event={"ID":"8d700fe5-5b61-4638-9883-cb767568ec47","Type":"ContainerStarted","Data":"4feb9111bc1d83ac77fc730d250844ed9e5a56bceefb48c16a302f97f7d3b612"} Apr 28 19:19:12.954476 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:12.954436 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:12.954631 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:12.954583 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:19:12.954671 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:12.954649 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls podName:59674cfb-c293-4b6c-8e89-c681d39465e1 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:14.954633819 +0000 UTC m=+139.460350463 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9sq48" (UID: "59674cfb-c293-4b6c-8e89-c681d39465e1") : secret "samples-operator-tls" not found Apr 28 19:19:13.053476 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.053442 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx"] Apr 28 19:19:13.055597 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.055580 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.055675 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.055601 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:13.055675 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.055636 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:13.055864 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:13.055842 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:19:13.055924 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:13.055908 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:15.055890312 +0000 UTC m=+139.561606977 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : secret "router-metrics-certs-default" not found Apr 28 19:19:13.055978 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:13.055947 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:15.055931701 +0000 UTC m=+139.561648345 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : configmap references non-existent config key: service-ca.crt Apr 28 19:19:13.059221 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.059198 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9qkbl\"" Apr 28 19:19:13.059221 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.059217 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 28 19:19:13.060431 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.060409 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:19:13.060431 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.060411 2565 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 28 19:19:13.060579 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.060445 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 28 19:19:13.067425 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.067405 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx"] Apr 28 19:19:13.156559 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.156467 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:13.156559 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.156506 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whmd\" (UniqueName: \"kubernetes.io/projected/533b151b-982f-4582-9292-150aa20dc9df-kube-api-access-5whmd\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.156559 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.156532 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b151b-982f-4582-9292-150aa20dc9df-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.156559 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.156548 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b151b-982f-4582-9292-150aa20dc9df-config\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.156811 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:13.156628 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:13.156811 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:13.156681 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls podName:9b167459-93b9-4e7b-bd66-94d693cab19e nodeName:}" failed. No retries permitted until 2026-04-28 19:19:15.156667333 +0000 UTC m=+139.662383977 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-mj7v4" (UID: "9b167459-93b9-4e7b-bd66-94d693cab19e") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:13.257699 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.257672 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b151b-982f-4582-9292-150aa20dc9df-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.257699 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.257698 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b151b-982f-4582-9292-150aa20dc9df-config\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.258022 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.257976 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5whmd\" (UniqueName: \"kubernetes.io/projected/533b151b-982f-4582-9292-150aa20dc9df-kube-api-access-5whmd\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.258247 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.258222 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/533b151b-982f-4582-9292-150aa20dc9df-config\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.259899 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.259879 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/533b151b-982f-4582-9292-150aa20dc9df-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.266534 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.266508 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whmd\" (UniqueName: \"kubernetes.io/projected/533b151b-982f-4582-9292-150aa20dc9df-kube-api-access-5whmd\") pod \"service-ca-operator-d6fc45fc5-tnqkx\" (UID: \"533b151b-982f-4582-9292-150aa20dc9df\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.363927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.363898 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" Apr 28 19:19:13.473260 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.473226 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv" event={"ID":"8d700fe5-5b61-4638-9883-cb767568ec47","Type":"ContainerStarted","Data":"cf6be793fe29fd1ab11da6cc03de1126658bade450996269dd6de003493f9aba"} Apr 28 19:19:13.475564 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.475536 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx"] Apr 28 19:19:13.479837 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:13.479813 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533b151b_982f_4582_9292_150aa20dc9df.slice/crio-6be39aa5e820dc0536d36d69389ec6286203facbbbc761c598159d4e910f1924 WatchSource:0}: Error finding container 6be39aa5e820dc0536d36d69389ec6286203facbbbc761c598159d4e910f1924: Status 404 returned error can't find the container with id 6be39aa5e820dc0536d36d69389ec6286203facbbbc761c598159d4e910f1924 Apr 28 19:19:13.489477 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.489436 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bqqnv" podStartSLOduration=1.35941311 podStartE2EDuration="2.489424399s" podCreationTimestamp="2026-04-28 19:19:11 +0000 UTC" firstStartedPulling="2026-04-28 19:19:11.69653932 +0000 UTC m=+136.202255963" lastFinishedPulling="2026-04-28 19:19:12.826550605 +0000 UTC m=+137.332267252" observedRunningTime="2026-04-28 19:19:13.488510331 +0000 UTC m=+137.994226996" watchObservedRunningTime="2026-04-28 19:19:13.489424399 +0000 UTC m=+137.995141065" Apr 28 19:19:13.718349 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.718267 
2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj"] Apr 28 19:19:13.720791 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.720772 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj" Apr 28 19:19:13.723473 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.723453 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-9w6l5\"" Apr 28 19:19:13.735176 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.735143 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj"] Apr 28 19:19:13.863199 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.863143 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqqr\" (UniqueName: \"kubernetes.io/projected/3daf2574-ff6f-4e8a-b1ed-11bf807c7403-kube-api-access-6jqqr\") pod \"network-check-source-8894fc9bd-vffwj\" (UID: \"3daf2574-ff6f-4e8a-b1ed-11bf807c7403\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj" Apr 28 19:19:13.963904 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.963866 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqqr\" (UniqueName: \"kubernetes.io/projected/3daf2574-ff6f-4e8a-b1ed-11bf807c7403-kube-api-access-6jqqr\") pod \"network-check-source-8894fc9bd-vffwj\" (UID: \"3daf2574-ff6f-4e8a-b1ed-11bf807c7403\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj" Apr 28 19:19:13.972928 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:13.972862 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqqr\" (UniqueName: 
\"kubernetes.io/projected/3daf2574-ff6f-4e8a-b1ed-11bf807c7403-kube-api-access-6jqqr\") pod \"network-check-source-8894fc9bd-vffwj\" (UID: \"3daf2574-ff6f-4e8a-b1ed-11bf807c7403\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj" Apr 28 19:19:14.029066 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:14.029031 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj" Apr 28 19:19:14.144699 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:14.144670 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj"] Apr 28 19:19:14.146826 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:14.146799 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3daf2574_ff6f_4e8a_b1ed_11bf807c7403.slice/crio-aa24da0b563271bad57a26a186db27689b56dd3ee428def3cc6f38a311e0d1b8 WatchSource:0}: Error finding container aa24da0b563271bad57a26a186db27689b56dd3ee428def3cc6f38a311e0d1b8: Status 404 returned error can't find the container with id aa24da0b563271bad57a26a186db27689b56dd3ee428def3cc6f38a311e0d1b8 Apr 28 19:19:14.476351 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:14.476309 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj" event={"ID":"3daf2574-ff6f-4e8a-b1ed-11bf807c7403","Type":"ContainerStarted","Data":"0157bea833a26153a3f263c2b94c764f3b540d53630626b94830cf0e8bb656ef"} Apr 28 19:19:14.476351 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:14.476348 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj" event={"ID":"3daf2574-ff6f-4e8a-b1ed-11bf807c7403","Type":"ContainerStarted","Data":"aa24da0b563271bad57a26a186db27689b56dd3ee428def3cc6f38a311e0d1b8"} Apr 28 19:19:14.477481 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:14.477453 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" event={"ID":"533b151b-982f-4582-9292-150aa20dc9df","Type":"ContainerStarted","Data":"6be39aa5e820dc0536d36d69389ec6286203facbbbc761c598159d4e910f1924"} Apr 28 19:19:14.496414 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:14.496367 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vffwj" podStartSLOduration=1.49635395 podStartE2EDuration="1.49635395s" podCreationTimestamp="2026-04-28 19:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:19:14.494706149 +0000 UTC m=+139.000422819" watchObservedRunningTime="2026-04-28 19:19:14.49635395 +0000 UTC m=+139.002070612" Apr 28 19:19:14.973451 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:14.973408 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:14.973642 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:14.973608 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:19:14.973704 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:14.973689 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls podName:59674cfb-c293-4b6c-8e89-c681d39465e1 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:19:18.973669179 +0000 UTC m=+143.479385827 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9sq48" (UID: "59674cfb-c293-4b6c-8e89-c681d39465e1") : secret "samples-operator-tls" not found Apr 28 19:19:15.074729 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.074683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:15.074729 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.074732 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:15.074971 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:15.074853 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:19:15.074971 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:15.074912 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:19.07489744 +0000 UTC m=+143.580614087 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : configmap references non-existent config key: service-ca.crt Apr 28 19:19:15.074971 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:15.074929 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:19.074922901 +0000 UTC m=+143.580639545 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : secret "router-metrics-certs-default" not found Apr 28 19:19:15.175641 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.175600 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:15.175819 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:15.175706 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:15.175819 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:15.175788 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls podName:9b167459-93b9-4e7b-bd66-94d693cab19e nodeName:}" failed. 
No retries permitted until 2026-04-28 19:19:19.175767496 +0000 UTC m=+143.681484143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-mj7v4" (UID: "9b167459-93b9-4e7b-bd66-94d693cab19e") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:15.571020 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.570986 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn"] Apr 28 19:19:15.573286 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.573268 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" Apr 28 19:19:15.575985 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.575960 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-mtmgn\"" Apr 28 19:19:15.576104 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.575996 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 28 19:19:15.577036 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.577018 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 28 19:19:15.603346 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.603319 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn"] Apr 28 19:19:15.679910 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.679874 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4l5cp\" (UniqueName: \"kubernetes.io/projected/d88080ac-e246-4a18-88af-b696d1f2fc08-kube-api-access-4l5cp\") pod \"migrator-74bb7799d9-ljnqn\" (UID: \"d88080ac-e246-4a18-88af-b696d1f2fc08\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" Apr 28 19:19:15.780449 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.780398 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l5cp\" (UniqueName: \"kubernetes.io/projected/d88080ac-e246-4a18-88af-b696d1f2fc08-kube-api-access-4l5cp\") pod \"migrator-74bb7799d9-ljnqn\" (UID: \"d88080ac-e246-4a18-88af-b696d1f2fc08\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" Apr 28 19:19:15.790371 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.790352 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l5cp\" (UniqueName: \"kubernetes.io/projected/d88080ac-e246-4a18-88af-b696d1f2fc08-kube-api-access-4l5cp\") pod \"migrator-74bb7799d9-ljnqn\" (UID: \"d88080ac-e246-4a18-88af-b696d1f2fc08\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" Apr 28 19:19:15.882992 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.882948 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" Apr 28 19:19:15.998629 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:15.998599 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn"] Apr 28 19:19:16.002023 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:16.001994 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88080ac_e246_4a18_88af_b696d1f2fc08.slice/crio-dc23fe2121be229bcacdab4deec9e76ef817f3858412c97ef20868f33e69903f WatchSource:0}: Error finding container dc23fe2121be229bcacdab4deec9e76ef817f3858412c97ef20868f33e69903f: Status 404 returned error can't find the container with id dc23fe2121be229bcacdab4deec9e76ef817f3858412c97ef20868f33e69903f Apr 28 19:19:16.482420 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:16.482382 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" event={"ID":"533b151b-982f-4582-9292-150aa20dc9df","Type":"ContainerStarted","Data":"8272e3b996e379968aa7d30f5add2db4d5575d9f32265f12c0c90daf6a5c517c"} Apr 28 19:19:16.483349 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:16.483325 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" event={"ID":"d88080ac-e246-4a18-88af-b696d1f2fc08","Type":"ContainerStarted","Data":"dc23fe2121be229bcacdab4deec9e76ef817f3858412c97ef20868f33e69903f"} Apr 28 19:19:16.518984 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:16.518929 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" podStartSLOduration=1.567523539 podStartE2EDuration="3.51891275s" podCreationTimestamp="2026-04-28 19:19:13 +0000 UTC" firstStartedPulling="2026-04-28 19:19:13.482030318 +0000 UTC 
m=+137.987746966" lastFinishedPulling="2026-04-28 19:19:15.433419529 +0000 UTC m=+139.939136177" observedRunningTime="2026-04-28 19:19:16.51718758 +0000 UTC m=+141.022904247" watchObservedRunningTime="2026-04-28 19:19:16.51891275 +0000 UTC m=+141.024629417" Apr 28 19:19:17.486691 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:17.486599 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" event={"ID":"d88080ac-e246-4a18-88af-b696d1f2fc08","Type":"ContainerStarted","Data":"a5d0c7822f452f67a8fa2514fc92c15db4b4e8f5b740c4e32d10d4a289719bb8"} Apr 28 19:19:17.486691 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:17.486644 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" event={"ID":"d88080ac-e246-4a18-88af-b696d1f2fc08","Type":"ContainerStarted","Data":"6589aaa6caf9647c7656dcc5d67e845b4d59e87c31b13656aa12acb58bb08d92"} Apr 28 19:19:17.508966 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:17.508916 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ljnqn" podStartSLOduration=1.471537864 podStartE2EDuration="2.508902101s" podCreationTimestamp="2026-04-28 19:19:15 +0000 UTC" firstStartedPulling="2026-04-28 19:19:16.003873005 +0000 UTC m=+140.509589648" lastFinishedPulling="2026-04-28 19:19:17.041237227 +0000 UTC m=+141.546953885" observedRunningTime="2026-04-28 19:19:17.508241258 +0000 UTC m=+142.013957925" watchObservedRunningTime="2026-04-28 19:19:17.508902101 +0000 UTC m=+142.014618834" Apr 28 19:19:18.413393 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:18.413363 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gg56k_5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550/dns-node-resolver/0.log" Apr 28 19:19:19.005524 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:19.005470 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" Apr 28 19:19:19.005954 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:19.005623 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:19:19.005954 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:19.005693 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls podName:59674cfb-c293-4b6c-8e89-c681d39465e1 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:27.005677941 +0000 UTC m=+151.511394585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-9sq48" (UID: "59674cfb-c293-4b6c-8e89-c681d39465e1") : secret "samples-operator-tls" not found Apr 28 19:19:19.106507 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:19.106458 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:19.106507 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:19.106510 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod 
\"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw" Apr 28 19:19:19.106724 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:19.106596 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:19:19.106724 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:19.106664 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:27.106646844 +0000 UTC m=+151.612363488 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : secret "router-metrics-certs-default" not found Apr 28 19:19:19.106724 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:19.106678 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:27.106671851 +0000 UTC m=+151.612388494 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : configmap references non-existent config key: service-ca.crt Apr 28 19:19:19.207864 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:19.207813 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" Apr 28 19:19:19.208037 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:19.207962 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:19:19.208037 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:19.208029 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls podName:9b167459-93b9-4e7b-bd66-94d693cab19e nodeName:}" failed. No retries permitted until 2026-04-28 19:19:27.208012837 +0000 UTC m=+151.713729485 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-mj7v4" (UID: "9b167459-93b9-4e7b-bd66-94d693cab19e") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:19:19.428229 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:19.428200 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ttwth_784cb0c1-1c08-41f8-8c08-e92cdf0c70ce/node-ca/0.log"
Apr 28 19:19:20.614988 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:20.614956 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ljnqn_d88080ac-e246-4a18-88af-b696d1f2fc08/migrator/0.log"
Apr 28 19:19:20.815794 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:20.815768 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ljnqn_d88080ac-e246-4a18-88af-b696d1f2fc08/graceful-termination/0.log"
Apr 28 19:19:27.070665 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.070613 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48"
Apr 28 19:19:27.072974 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.072952 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59674cfb-c293-4b6c-8e89-c681d39465e1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-9sq48\" (UID: \"59674cfb-c293-4b6c-8e89-c681d39465e1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48"
Apr 28 19:19:27.083370 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.083348 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48"
Apr 28 19:19:27.172528 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.171804 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:27.172528 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.171859 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:27.172528 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:27.172096 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle podName:977bb0b7-3623-4100-ba3a-1b9d24046162 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:43.172077544 +0000 UTC m=+167.677794188 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle") pod "router-default-d64467788-4mcrw" (UID: "977bb0b7-3623-4100-ba3a-1b9d24046162") : configmap references non-existent config key: service-ca.crt
Apr 28 19:19:27.174563 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.174539 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/977bb0b7-3623-4100-ba3a-1b9d24046162-metrics-certs\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:27.210145 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.210120 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48"]
Apr 28 19:19:27.272910 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.272881 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4"
Apr 28 19:19:27.273035 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:27.273018 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:19:27.273091 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:27.273077 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls podName:9b167459-93b9-4e7b-bd66-94d693cab19e nodeName:}" failed. No retries permitted until 2026-04-28 19:19:43.273061961 +0000 UTC m=+167.778778610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-mj7v4" (UID: "9b167459-93b9-4e7b-bd66-94d693cab19e") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:19:27.513116 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:27.513084 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" event={"ID":"59674cfb-c293-4b6c-8e89-c681d39465e1","Type":"ContainerStarted","Data":"3e391248cb3df90401de61d81699eee67b18c637f1744f6c19897354eef5b779"}
Apr 28 19:19:29.523136 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:29.523096 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" event={"ID":"59674cfb-c293-4b6c-8e89-c681d39465e1","Type":"ContainerStarted","Data":"252a56fec7bf181f3bab5e28c77e088ee2204cc9b3a1de29321c6a11b4dfc26b"}
Apr 28 19:19:29.523532 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:29.523144 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" event={"ID":"59674cfb-c293-4b6c-8e89-c681d39465e1","Type":"ContainerStarted","Data":"b5f804646fcd292cce995c0a5e991fd28b6982ee01c8b6249346567af439b5da"}
Apr 28 19:19:29.546728 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:29.546682 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-9sq48" podStartSLOduration=16.64190543 podStartE2EDuration="18.546666481s" podCreationTimestamp="2026-04-28 19:19:11 +0000 UTC" firstStartedPulling="2026-04-28 19:19:27.2505187 +0000 UTC m=+151.756235347" lastFinishedPulling="2026-04-28 19:19:29.15527975 +0000 UTC m=+153.660996398" observedRunningTime="2026-04-28 19:19:29.546152971 +0000 UTC m=+154.051869863" watchObservedRunningTime="2026-04-28 19:19:29.546666481 +0000 UTC m=+154.052383149"
Apr 28 19:19:32.458251 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:32.458200 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2jwqs" podUID="4d5f310c-a755-4af1-8570-335ac92bb8cf"
Apr 28 19:19:32.475359 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:32.475322 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-w7t9f" podUID="4f0dd845-b66a-4d78-b7a3-811ca24028e4"
Apr 28 19:19:32.530777 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:32.530752 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:19:32.530911 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:32.530761 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:19:34.115869 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:19:34.115829 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-hndjc" podUID="cbe36bec-c099-4625-b8c9-eb67c281b442"
Apr 28 19:19:36.896057 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.896021 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9zfhp"]
Apr 28 19:19:36.899134 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.899114 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:36.904801 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.904778 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 28 19:19:36.906275 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.906256 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 28 19:19:36.907036 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.907018 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 28 19:19:36.914523 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.914508 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8vs7f\""
Apr 28 19:19:36.937108 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.937082 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9zfhp"]
Apr 28 19:19:36.945712 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.945696 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 28 19:19:36.949650 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.949627 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-data-volume\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:36.949745 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.949718 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9z84\" (UniqueName: \"kubernetes.io/projected/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-kube-api-access-w9z84\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:36.949805 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.949765 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:36.949858 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.949815 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-crio-socket\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:36.949913 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:36.949892 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.051236 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.051202 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-data-volume\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.051395 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.051247 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9z84\" (UniqueName: \"kubernetes.io/projected/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-kube-api-access-w9z84\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.051395 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.051271 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.051395 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.051295 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-crio-socket\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.051562 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.051547 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.051625 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.051605 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-crio-socket\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.051664 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.051601 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-data-volume\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.051849 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.051835 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.053743 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.053723 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.078782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.078756 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9z84\" (UniqueName: \"kubernetes.io/projected/7b91d51c-a9f3-41ac-8b7e-09e04af9b26a-kube-api-access-w9z84\") pod \"insights-runtime-extractor-9zfhp\" (UID: \"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a\") " pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.208184 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.208076 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9zfhp"
Apr 28 19:19:37.353711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.353660 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9zfhp"]
Apr 28 19:19:37.355456 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.355436 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:19:37.357728 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:37.357702 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b91d51c_a9f3_41ac_8b7e_09e04af9b26a.slice/crio-fc8ad7e73cde2e3f6890464ae772c2f30ee55dd2bc3025df7b7454d1fa5336fd WatchSource:0}: Error finding container fc8ad7e73cde2e3f6890464ae772c2f30ee55dd2bc3025df7b7454d1fa5336fd: Status 404 returned error can't find the container with id fc8ad7e73cde2e3f6890464ae772c2f30ee55dd2bc3025df7b7454d1fa5336fd
Apr 28 19:19:37.357849 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.357756 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d5f310c-a755-4af1-8570-335ac92bb8cf-metrics-tls\") pod \"dns-default-2jwqs\" (UID: \"4d5f310c-a755-4af1-8570-335ac92bb8cf\") " pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:19:37.456110 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.456082 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:19:37.458342 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.458288 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f0dd845-b66a-4d78-b7a3-811ca24028e4-cert\") pod \"ingress-canary-w7t9f\" (UID: \"4f0dd845-b66a-4d78-b7a3-811ca24028e4\") " pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:19:37.543752 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.543719 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9zfhp" event={"ID":"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a","Type":"ContainerStarted","Data":"dd8dfd13436b7e8fbd7cdfb3062506cd6b339aad3c5ec9543da4a40c58c6adc8"}
Apr 28 19:19:37.543752 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.543754 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9zfhp" event={"ID":"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a","Type":"ContainerStarted","Data":"fc8ad7e73cde2e3f6890464ae772c2f30ee55dd2bc3025df7b7454d1fa5336fd"}
Apr 28 19:19:37.634943 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.634917 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-524nx\""
Apr 28 19:19:37.635102 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.635065 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jtx2j\""
Apr 28 19:19:37.642554 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.642534 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w7t9f"
Apr 28 19:19:37.642659 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.642559 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:19:37.782447 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.782344 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2jwqs"]
Apr 28 19:19:37.785014 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:37.784983 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5f310c_a755_4af1_8570_335ac92bb8cf.slice/crio-577b2947f1ff7cd2cc2629e08f79050c5ef82dbb57eda3a8ca5fbf70e486becf WatchSource:0}: Error finding container 577b2947f1ff7cd2cc2629e08f79050c5ef82dbb57eda3a8ca5fbf70e486becf: Status 404 returned error can't find the container with id 577b2947f1ff7cd2cc2629e08f79050c5ef82dbb57eda3a8ca5fbf70e486becf
Apr 28 19:19:37.808561 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:37.808533 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w7t9f"]
Apr 28 19:19:37.885710 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:37.885672 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f0dd845_b66a_4d78_b7a3_811ca24028e4.slice/crio-966b1f5172ee8aa02553f6972d39bbbfdb5aca711678d1b437bac999b7a3c6c2 WatchSource:0}: Error finding container 966b1f5172ee8aa02553f6972d39bbbfdb5aca711678d1b437bac999b7a3c6c2: Status 404 returned error can't find the container with id 966b1f5172ee8aa02553f6972d39bbbfdb5aca711678d1b437bac999b7a3c6c2
Apr 28 19:19:38.185996 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.185962 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-8d77s"]
Apr 28 19:19:38.187837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.187821 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8d77s"
Apr 28 19:19:38.193035 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.193015 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-s7g8j\""
Apr 28 19:19:38.196782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.196763 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 28 19:19:38.203312 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.203296 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 28 19:19:38.211154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.211128 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8d77s"]
Apr 28 19:19:38.261928 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.261900 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxj5z\" (UniqueName: \"kubernetes.io/projected/4a51790b-71af-4495-bf67-814c27aeb63e-kube-api-access-nxj5z\") pod \"downloads-6bcc868b7-8d77s\" (UID: \"4a51790b-71af-4495-bf67-814c27aeb63e\") " pod="openshift-console/downloads-6bcc868b7-8d77s"
Apr 28 19:19:38.362982 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.362835 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxj5z\" (UniqueName: \"kubernetes.io/projected/4a51790b-71af-4495-bf67-814c27aeb63e-kube-api-access-nxj5z\") pod \"downloads-6bcc868b7-8d77s\" (UID: \"4a51790b-71af-4495-bf67-814c27aeb63e\") " pod="openshift-console/downloads-6bcc868b7-8d77s"
Apr 28 19:19:38.384509 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.384445 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxj5z\" (UniqueName: \"kubernetes.io/projected/4a51790b-71af-4495-bf67-814c27aeb63e-kube-api-access-nxj5z\") pod \"downloads-6bcc868b7-8d77s\" (UID: \"4a51790b-71af-4495-bf67-814c27aeb63e\") " pod="openshift-console/downloads-6bcc868b7-8d77s"
Apr 28 19:19:38.496737 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.496656 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8d77s"
Apr 28 19:19:38.549551 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.549515 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9zfhp" event={"ID":"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a","Type":"ContainerStarted","Data":"07322ff96d660174fbc5baa0718d50f5c0f4a683ee0646af9b4b95569cf85082"}
Apr 28 19:19:38.551004 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.550963 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w7t9f" event={"ID":"4f0dd845-b66a-4d78-b7a3-811ca24028e4","Type":"ContainerStarted","Data":"966b1f5172ee8aa02553f6972d39bbbfdb5aca711678d1b437bac999b7a3c6c2"}
Apr 28 19:19:38.553518 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.553477 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2jwqs" event={"ID":"4d5f310c-a755-4af1-8570-335ac92bb8cf","Type":"ContainerStarted","Data":"577b2947f1ff7cd2cc2629e08f79050c5ef82dbb57eda3a8ca5fbf70e486becf"}
Apr 28 19:19:38.682273 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:38.682221 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8d77s"]
Apr 28 19:19:38.686231 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:38.686201 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a51790b_71af_4495_bf67_814c27aeb63e.slice/crio-f90a41f523a08d52f1395f10fd4138f99e354f098b3eb873aec12a901088ea08 WatchSource:0}: Error finding container f90a41f523a08d52f1395f10fd4138f99e354f098b3eb873aec12a901088ea08: Status 404 returned error can't find the container with id f90a41f523a08d52f1395f10fd4138f99e354f098b3eb873aec12a901088ea08
Apr 28 19:19:39.557662 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:39.557608 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8d77s" event={"ID":"4a51790b-71af-4495-bf67-814c27aeb63e","Type":"ContainerStarted","Data":"f90a41f523a08d52f1395f10fd4138f99e354f098b3eb873aec12a901088ea08"}
Apr 28 19:19:40.566711 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:40.566676 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9zfhp" event={"ID":"7b91d51c-a9f3-41ac-8b7e-09e04af9b26a","Type":"ContainerStarted","Data":"22a7607eb8ad502017dabbbe3571be72750b206031f86eb92748f6f9d6eb0b14"}
Apr 28 19:19:40.569508 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:40.569467 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w7t9f" event={"ID":"4f0dd845-b66a-4d78-b7a3-811ca24028e4","Type":"ContainerStarted","Data":"b6b07f5789d065012c879d3f94772705a204e5900077c1c0d0ff0e91e4c7f050"}
Apr 28 19:19:40.571427 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:40.571374 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2jwqs" event={"ID":"4d5f310c-a755-4af1-8570-335ac92bb8cf","Type":"ContainerStarted","Data":"2f625a18c8ea9eb02f4d894dd440e8be2e03108ce939dc067a358c335f4f9b13"}
Apr 28 19:19:40.596270 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:40.596181 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9zfhp" podStartSLOduration=1.7325564180000002 podStartE2EDuration="4.596151016s" podCreationTimestamp="2026-04-28 19:19:36 +0000 UTC" firstStartedPulling="2026-04-28 19:19:37.409617911 +0000 UTC m=+161.915334555" lastFinishedPulling="2026-04-28 19:19:40.273212502 +0000 UTC m=+164.778929153" observedRunningTime="2026-04-28 19:19:40.594399145 +0000 UTC m=+165.100115801" watchObservedRunningTime="2026-04-28 19:19:40.596151016 +0000 UTC m=+165.101867681"
Apr 28 19:19:40.632105 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:40.632057 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w7t9f" podStartSLOduration=129.243271078 podStartE2EDuration="2m11.632042603s" podCreationTimestamp="2026-04-28 19:17:29 +0000 UTC" firstStartedPulling="2026-04-28 19:19:37.887481331 +0000 UTC m=+162.393197976" lastFinishedPulling="2026-04-28 19:19:40.276252845 +0000 UTC m=+164.781969501" observedRunningTime="2026-04-28 19:19:40.63122655 +0000 UTC m=+165.136943215" watchObservedRunningTime="2026-04-28 19:19:40.632042603 +0000 UTC m=+165.137759266"
Apr 28 19:19:41.576563 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:41.576522 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2jwqs" event={"ID":"4d5f310c-a755-4af1-8570-335ac92bb8cf","Type":"ContainerStarted","Data":"d70a00b44b9610acf0c3d695a043abb89cca5600379dd90dec56ebd09cfd5a1c"}
Apr 28 19:19:41.600064 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:41.600010 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2jwqs" podStartSLOduration=130.116559664 podStartE2EDuration="2m12.599994682s" podCreationTimestamp="2026-04-28 19:17:29 +0000 UTC" firstStartedPulling="2026-04-28 19:19:37.787196081 +0000 UTC m=+162.292912725" lastFinishedPulling="2026-04-28 19:19:40.27063109 +0000 UTC m=+164.776347743" observedRunningTime="2026-04-28 19:19:41.599347816 +0000 UTC m=+166.105064484" watchObservedRunningTime="2026-04-28 19:19:41.599994682 +0000 UTC m=+166.105711360"
Apr 28 19:19:42.579878 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:42.579846 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2jwqs"
Apr 28 19:19:43.205423 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.205387 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:43.206099 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.206073 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977bb0b7-3623-4100-ba3a-1b9d24046162-service-ca-bundle\") pod \"router-default-d64467788-4mcrw\" (UID: \"977bb0b7-3623-4100-ba3a-1b9d24046162\") " pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:43.306299 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.306259 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4"
Apr 28 19:19:43.308960 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.308921 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b167459-93b9-4e7b-bd66-94d693cab19e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-mj7v4\" (UID: \"9b167459-93b9-4e7b-bd66-94d693cab19e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4"
Apr 28 19:19:43.395482 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.395449 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:43.478355 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.478333 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4"
Apr 28 19:19:43.536665 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.536615 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d64467788-4mcrw"]
Apr 28 19:19:43.539762 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:43.539731 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977bb0b7_3623_4100_ba3a_1b9d24046162.slice/crio-438842f4c627bbb06125c3314b1d0d47a092290422d2ef57549b4d41e6b4d753 WatchSource:0}: Error finding container 438842f4c627bbb06125c3314b1d0d47a092290422d2ef57549b4d41e6b4d753: Status 404 returned error can't find the container with id 438842f4c627bbb06125c3314b1d0d47a092290422d2ef57549b4d41e6b4d753
Apr 28 19:19:43.590807 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.590768 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d64467788-4mcrw" event={"ID":"977bb0b7-3623-4100-ba3a-1b9d24046162","Type":"ContainerStarted","Data":"438842f4c627bbb06125c3314b1d0d47a092290422d2ef57549b4d41e6b4d753"}
Apr 28 19:19:43.648735 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:43.648698 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4"]
Apr 28 19:19:43.652893 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:43.652859 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b167459_93b9_4e7b_bd66_94d693cab19e.slice/crio-f14d0073ed223f493c92f69982ebed3bd724502513c25abeb038833fe5a1f9a7 WatchSource:0}: Error finding container f14d0073ed223f493c92f69982ebed3bd724502513c25abeb038833fe5a1f9a7: Status 404 returned error can't find the container with id f14d0073ed223f493c92f69982ebed3bd724502513c25abeb038833fe5a1f9a7
Apr 28 19:19:44.595404 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:44.595289 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d64467788-4mcrw" event={"ID":"977bb0b7-3623-4100-ba3a-1b9d24046162","Type":"ContainerStarted","Data":"ddf3d66660b86e95a5dfd1f69456ee0493da1a838ba23ab915bc146b5302c70b"}
Apr 28 19:19:44.596596 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:44.596554 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" event={"ID":"9b167459-93b9-4e7b-bd66-94d693cab19e","Type":"ContainerStarted","Data":"f14d0073ed223f493c92f69982ebed3bd724502513c25abeb038833fe5a1f9a7"}
Apr 28 19:19:45.396348 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:45.396292 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:45.399615 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:45.399586 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:45.426938 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:45.426879 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-d64467788-4mcrw" podStartSLOduration=34.42686147 podStartE2EDuration="34.42686147s" podCreationTimestamp="2026-04-28 19:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:19:44.620083351 +0000 UTC m=+169.125800017" watchObservedRunningTime="2026-04-28 19:19:45.42686147 +0000 UTC m=+169.932578136"
Apr 28 19:19:45.600921 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:45.600892 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:45.602282 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:45.602258 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-d64467788-4mcrw"
Apr 28 19:19:46.111178 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:46.110799 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc"
Apr 28 19:19:46.604241 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:46.604191 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" event={"ID":"9b167459-93b9-4e7b-bd66-94d693cab19e","Type":"ContainerStarted","Data":"223ead126df263bfd30ce729a016db6219189666ba2b4819310aea3f508d1c15"}
Apr 28 19:19:46.622580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:46.622527 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-mj7v4" podStartSLOduration=33.485361391 podStartE2EDuration="35.622514215s" podCreationTimestamp="2026-04-28 19:19:11 +0000 UTC" firstStartedPulling="2026-04-28 19:19:43.655239531 +0000 UTC m=+168.160956181" lastFinishedPulling="2026-04-28 19:19:45.792392348 +0000 UTC m=+170.298109005" observedRunningTime="2026-04-28 19:19:46.622136195 +0000 UTC m=+171.127852862" watchObservedRunningTime="2026-04-28 19:19:46.622514215 +0000 UTC m=+171.128230881"
Apr 28 19:19:47.515479 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.515442 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86bb844cdf-tq25p"]
Apr 28 19:19:47.517824 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.517796 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.531384 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.531357 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 28 19:19:47.531505 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.531386 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 28 19:19:47.532054 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.532032 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 28 19:19:47.532640 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.532622 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 28 19:19:47.535513 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.535490 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 28 19:19:47.536968 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.536949 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-5nxrd\"" Apr 28 19:19:47.547110 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.546839 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86bb844cdf-tq25p"] Apr 28 19:19:47.643817 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.643778 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-console-config\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.644397 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:19:47.643830 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-service-ca\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.644397 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.643863 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-oauth-serving-cert\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.644397 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.643892 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-serving-cert\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.644397 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.643969 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69wn6\" (UniqueName: \"kubernetes.io/projected/7effa71b-c418-4208-b242-7ebd04c719b2-kube-api-access-69wn6\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.644397 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.644029 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-oauth-config\") pod \"console-86bb844cdf-tq25p\" 
(UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.745081 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.745046 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-oauth-serving-cert\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.745283 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.745112 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-serving-cert\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.745283 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.745143 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69wn6\" (UniqueName: \"kubernetes.io/projected/7effa71b-c418-4208-b242-7ebd04c719b2-kube-api-access-69wn6\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.745393 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.745359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-oauth-config\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.745753 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.745536 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-console-config\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.745753 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.745620 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-service-ca\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.746385 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.746326 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-service-ca\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.746385 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.746326 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-console-config\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.746551 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.746490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-oauth-serving-cert\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.748027 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.748006 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-oauth-config\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.748261 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.748240 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-serving-cert\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.757926 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.757905 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69wn6\" (UniqueName: \"kubernetes.io/projected/7effa71b-c418-4208-b242-7ebd04c719b2-kube-api-access-69wn6\") pod \"console-86bb844cdf-tq25p\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.828899 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.828822 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:47.964459 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:47.964425 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86bb844cdf-tq25p"] Apr 28 19:19:47.967908 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:47.967875 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7effa71b_c418_4208_b242_7ebd04c719b2.slice/crio-91415b4405447403696c5e344dff15f2c7d14d2d330f816220d1fce23462a7fd WatchSource:0}: Error finding container 91415b4405447403696c5e344dff15f2c7d14d2d330f816220d1fce23462a7fd: Status 404 returned error can't find the container with id 91415b4405447403696c5e344dff15f2c7d14d2d330f816220d1fce23462a7fd Apr 28 19:19:48.611481 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:48.611439 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bb844cdf-tq25p" event={"ID":"7effa71b-c418-4208-b242-7ebd04c719b2","Type":"ContainerStarted","Data":"91415b4405447403696c5e344dff15f2c7d14d2d330f816220d1fce23462a7fd"} Apr 28 19:19:49.412855 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.412818 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kdztz"] Apr 28 19:19:49.416120 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.415316 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.438005 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.437963 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5qq9f\"" Apr 28 19:19:49.438433 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.437896 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 28 19:19:49.439079 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.439032 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 28 19:19:49.439759 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.439459 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 28 19:19:49.457781 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.457733 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kdztz"] Apr 28 19:19:49.463337 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.463299 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2z7f\" (UniqueName: \"kubernetes.io/projected/14945d18-ba40-4efc-9f03-887d459daa01-kube-api-access-j2z7f\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.463481 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.463408 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/14945d18-ba40-4efc-9f03-887d459daa01-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: 
\"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.463602 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.463477 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14945d18-ba40-4efc-9f03-887d459daa01-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.463602 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.463544 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14945d18-ba40-4efc-9f03-887d459daa01-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.564133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.564095 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2z7f\" (UniqueName: \"kubernetes.io/projected/14945d18-ba40-4efc-9f03-887d459daa01-kube-api-access-j2z7f\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.564344 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.564155 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/14945d18-ba40-4efc-9f03-887d459daa01-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.564344 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.564207 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14945d18-ba40-4efc-9f03-887d459daa01-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.564599 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.564572 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14945d18-ba40-4efc-9f03-887d459daa01-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.565108 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.565058 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14945d18-ba40-4efc-9f03-887d459daa01-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.567297 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.567254 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/14945d18-ba40-4efc-9f03-887d459daa01-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.567642 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.567620 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/14945d18-ba40-4efc-9f03-887d459daa01-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.579356 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.579308 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2z7f\" (UniqueName: \"kubernetes.io/projected/14945d18-ba40-4efc-9f03-887d459daa01-kube-api-access-j2z7f\") pod \"prometheus-operator-5676c8c784-kdztz\" (UID: \"14945d18-ba40-4efc-9f03-887d459daa01\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:49.727648 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:49.727559 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" Apr 28 19:19:52.593913 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:52.593881 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2jwqs" Apr 28 19:19:56.384266 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:56.384229 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kdztz"] Apr 28 19:19:56.387666 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:19:56.387402 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14945d18_ba40_4efc_9f03_887d459daa01.slice/crio-a27a1d47d636d8010f0441b9e0b60747e9c61575d697aa7c6b0c15ba96e84fb2 WatchSource:0}: Error finding container a27a1d47d636d8010f0441b9e0b60747e9c61575d697aa7c6b0c15ba96e84fb2: Status 404 returned error can't find the container with id a27a1d47d636d8010f0441b9e0b60747e9c61575d697aa7c6b0c15ba96e84fb2 Apr 28 19:19:56.634470 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:56.634428 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/downloads-6bcc868b7-8d77s" event={"ID":"4a51790b-71af-4495-bf67-814c27aeb63e","Type":"ContainerStarted","Data":"c8f2bfd66ee09286cecabbd01ce3f7925271328fc901dd5510467f754e292463"} Apr 28 19:19:56.634719 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:56.634695 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-8d77s" Apr 28 19:19:56.636189 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:56.636013 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" event={"ID":"14945d18-ba40-4efc-9f03-887d459daa01","Type":"ContainerStarted","Data":"a27a1d47d636d8010f0441b9e0b60747e9c61575d697aa7c6b0c15ba96e84fb2"} Apr 28 19:19:56.637729 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:56.637701 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bb844cdf-tq25p" event={"ID":"7effa71b-c418-4208-b242-7ebd04c719b2","Type":"ContainerStarted","Data":"3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0"} Apr 28 19:19:56.658065 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:56.658016 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-8d77s" podStartSLOduration=1.069359167 podStartE2EDuration="18.658002978s" podCreationTimestamp="2026-04-28 19:19:38 +0000 UTC" firstStartedPulling="2026-04-28 19:19:38.691207011 +0000 UTC m=+163.196923655" lastFinishedPulling="2026-04-28 19:19:56.279850819 +0000 UTC m=+180.785567466" observedRunningTime="2026-04-28 19:19:56.65746806 +0000 UTC m=+181.163184726" watchObservedRunningTime="2026-04-28 19:19:56.658002978 +0000 UTC m=+181.163719645" Apr 28 19:19:56.658787 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:56.658763 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-8d77s" Apr 28 19:19:56.681857 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:56.681806 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86bb844cdf-tq25p" podStartSLOduration=1.403700023 podStartE2EDuration="9.681791791s" podCreationTimestamp="2026-04-28 19:19:47 +0000 UTC" firstStartedPulling="2026-04-28 19:19:47.970109413 +0000 UTC m=+172.475826059" lastFinishedPulling="2026-04-28 19:19:56.248201171 +0000 UTC m=+180.753917827" observedRunningTime="2026-04-28 19:19:56.680470123 +0000 UTC m=+181.186186802" watchObservedRunningTime="2026-04-28 19:19:56.681791791 +0000 UTC m=+181.187508458" Apr 28 19:19:57.830063 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:57.829989 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:57.830063 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:57.830082 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:57.836279 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:57.836255 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:58.648826 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:58.648795 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:19:59.649320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:59.649274 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" event={"ID":"14945d18-ba40-4efc-9f03-887d459daa01","Type":"ContainerStarted","Data":"8fe63305629f75011420e84d672f8fe78130ec510877ff2413a5768829f84322"} Apr 28 19:19:59.649320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:59.649324 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" 
event={"ID":"14945d18-ba40-4efc-9f03-887d459daa01","Type":"ContainerStarted","Data":"cd14484e8f8c347aba001503226cda5da3c860a0bd330ba91c0f9d2890e64a9e"} Apr 28 19:19:59.689828 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:19:59.689771 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdztz" podStartSLOduration=8.170454375 podStartE2EDuration="10.68975504s" podCreationTimestamp="2026-04-28 19:19:49 +0000 UTC" firstStartedPulling="2026-04-28 19:19:56.390792326 +0000 UTC m=+180.896508974" lastFinishedPulling="2026-04-28 19:19:58.910092979 +0000 UTC m=+183.415809639" observedRunningTime="2026-04-28 19:19:59.688685651 +0000 UTC m=+184.194402316" watchObservedRunningTime="2026-04-28 19:19:59.68975504 +0000 UTC m=+184.195471707" Apr 28 19:20:00.566061 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:00.566028 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86bb844cdf-tq25p"] Apr 28 19:20:01.801563 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.801524 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-62xq8"] Apr 28 19:20:01.836430 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.836383 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dg759"] Apr 28 19:20:01.836605 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.836579 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:01.839630 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.839607 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 28 19:20:01.839746 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.839711 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 28 19:20:01.839936 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.839913 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 28 19:20:01.840067 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.839992 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-6n79q\"" Apr 28 19:20:01.857506 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.857484 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-62xq8"] Apr 28 19:20:01.857647 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.857628 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.862097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.862075 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-srbh6\"" Apr 28 19:20:01.862269 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.862246 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 28 19:20:01.862373 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.862252 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 28 19:20:01.862373 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.862365 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 28 19:20:01.966897 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.966864 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:01.967049 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.966907 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:01.967049 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:20:01.966925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-sys\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.967049 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.966988 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-textfile\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.967049 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967039 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-root\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.967285 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967072 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:01.967285 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967125 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ed37b5e-0c0e-4891-be01-307728d47ad3-metrics-client-ca\") pod 
\"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:01.967285 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967199 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f949\" (UniqueName: \"kubernetes.io/projected/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-api-access-8f949\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:01.967285 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967233 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-accelerators-collector-config\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.967445 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967301 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2ed37b5e-0c0e-4891-be01-307728d47ad3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:01.967445 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967339 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " 
pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.967445 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967392 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-wtmp\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.967445 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967422 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-metrics-client-ca\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.967610 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967468 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29gw2\" (UniqueName: \"kubernetes.io/projected/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-kube-api-access-29gw2\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:01.967610 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:01.967509 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-tls\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068350 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-wtmp\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068399 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-metrics-client-ca\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068660 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068433 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29gw2\" (UniqueName: \"kubernetes.io/projected/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-kube-api-access-29gw2\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068660 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068546 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-tls\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068660 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068571 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-wtmp\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068660 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.068660 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068631 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.068894 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068661 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-sys\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068894 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:20:02.068681 2565 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 28 19:20:02.068894 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:20:02.068745 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-tls podName:35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b nodeName:}" failed. No retries permitted until 2026-04-28 19:20:02.568724641 +0000 UTC m=+187.074441301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-tls") pod "node-exporter-dg759" (UID: "35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b") : secret "node-exporter-tls" not found Apr 28 19:20:02.068894 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068744 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-sys\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068894 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068688 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-textfile\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068894 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068820 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-root\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.068894 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068854 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.068894 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068886 
2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ed37b5e-0c0e-4891-be01-307728d47ad3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.069319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068903 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-root\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.069319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068919 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f949\" (UniqueName: \"kubernetes.io/projected/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-api-access-8f949\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.069319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.068954 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-accelerators-collector-config\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.069319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.069008 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2ed37b5e-0c0e-4891-be01-307728d47ad3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: 
\"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.069319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.069037 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.069319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.069112 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-textfile\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.069319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.069201 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-metrics-client-ca\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.069653 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.069399 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.069653 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.069604 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ed37b5e-0c0e-4891-be01-307728d47ad3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.069653 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.069623 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-accelerators-collector-config\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.069812 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.069698 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2ed37b5e-0c0e-4891-be01-307728d47ad3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.071592 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.071560 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.071702 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.071605 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: 
\"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.071702 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.071618 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.080689 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.080664 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29gw2\" (UniqueName: \"kubernetes.io/projected/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-kube-api-access-29gw2\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.081786 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.081764 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f949\" (UniqueName: \"kubernetes.io/projected/2ed37b5e-0c0e-4891-be01-307728d47ad3-kube-api-access-8f949\") pod \"kube-state-metrics-69db897b98-62xq8\" (UID: \"2ed37b5e-0c0e-4891-be01-307728d47ad3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.147567 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.147532 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" Apr 28 19:20:02.286133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.286102 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-62xq8"] Apr 28 19:20:02.290096 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:20:02.290067 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed37b5e_0c0e_4891_be01_307728d47ad3.slice/crio-85c2ff18b71facf24fd1725189362e5a1adf902681ba86e9d9c450c42c87bbab WatchSource:0}: Error finding container 85c2ff18b71facf24fd1725189362e5a1adf902681ba86e9d9c450c42c87bbab: Status 404 returned error can't find the container with id 85c2ff18b71facf24fd1725189362e5a1adf902681ba86e9d9c450c42c87bbab Apr 28 19:20:02.572797 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.572759 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-tls\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.575633 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.575608 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b-node-exporter-tls\") pod \"node-exporter-dg759\" (UID: \"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b\") " pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.659517 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.659480 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" event={"ID":"2ed37b5e-0c0e-4891-be01-307728d47ad3","Type":"ContainerStarted","Data":"85c2ff18b71facf24fd1725189362e5a1adf902681ba86e9d9c450c42c87bbab"} Apr 
28 19:20:02.768629 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:02.768594 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dg759" Apr 28 19:20:02.778869 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:20:02.778827 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d30ac3_c6ef_47c7_9d24_bb4f9f7faa0b.slice/crio-06ac01a93b3438d96789f9095470265f66016364ef28a18674d7916f1e78e405 WatchSource:0}: Error finding container 06ac01a93b3438d96789f9095470265f66016364ef28a18674d7916f1e78e405: Status 404 returned error can't find the container with id 06ac01a93b3438d96789f9095470265f66016364ef28a18674d7916f1e78e405 Apr 28 19:20:03.665645 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:03.664387 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dg759" event={"ID":"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b","Type":"ContainerStarted","Data":"06ac01a93b3438d96789f9095470265f66016364ef28a18674d7916f1e78e405"} Apr 28 19:20:04.669944 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:04.669904 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" event={"ID":"2ed37b5e-0c0e-4891-be01-307728d47ad3","Type":"ContainerStarted","Data":"6c9c2ed73ea90c6c697a3070dbb4a0e4507f7537345fbe0b9df8a199502e04f1"} Apr 28 19:20:04.670369 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:04.669954 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" event={"ID":"2ed37b5e-0c0e-4891-be01-307728d47ad3","Type":"ContainerStarted","Data":"92ce5c9ad70a3fab242f89ae55778e2a76d92ef71d4d808ac65f8f39c745ad05"} Apr 28 19:20:04.671489 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:04.671456 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dg759" 
event={"ID":"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b","Type":"ContainerStarted","Data":"8c56e724c10f1d462f07f977e2c43ef9e462c07ee1e86ccd02b9a3738b832bc1"} Apr 28 19:20:05.676462 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:05.676425 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" event={"ID":"2ed37b5e-0c0e-4891-be01-307728d47ad3","Type":"ContainerStarted","Data":"097151e3cf4e7041963a12cb9b0d65e52583faaef46b713262bb0513d533f4cb"} Apr 28 19:20:05.677869 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:05.677843 2565 generic.go:358] "Generic (PLEG): container finished" podID="35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b" containerID="8c56e724c10f1d462f07f977e2c43ef9e462c07ee1e86ccd02b9a3738b832bc1" exitCode=0 Apr 28 19:20:05.677992 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:05.677913 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dg759" event={"ID":"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b","Type":"ContainerDied","Data":"8c56e724c10f1d462f07f977e2c43ef9e462c07ee1e86ccd02b9a3738b832bc1"} Apr 28 19:20:05.712745 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:05.712694 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-62xq8" podStartSLOduration=2.604858689 podStartE2EDuration="4.712677011s" podCreationTimestamp="2026-04-28 19:20:01 +0000 UTC" firstStartedPulling="2026-04-28 19:20:02.292366304 +0000 UTC m=+186.798082949" lastFinishedPulling="2026-04-28 19:20:04.400184622 +0000 UTC m=+188.905901271" observedRunningTime="2026-04-28 19:20:05.710678474 +0000 UTC m=+190.216395141" watchObservedRunningTime="2026-04-28 19:20:05.712677011 +0000 UTC m=+190.218393677" Apr 28 19:20:06.682314 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:06.682277 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dg759" 
event={"ID":"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b","Type":"ContainerStarted","Data":"fc33ad61218a9a9c0c9892df9eed2b8cfc1072f8789ce17450f9658838a8efb3"} Apr 28 19:20:06.682314 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:06.682319 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dg759" event={"ID":"35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b","Type":"ContainerStarted","Data":"d74e233dc25093facbf2b2555ab4c887b33bed06874755b6b319299705941073"} Apr 28 19:20:06.707572 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:06.707528 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dg759" podStartSLOduration=4.086416254 podStartE2EDuration="5.707512112s" podCreationTimestamp="2026-04-28 19:20:01 +0000 UTC" firstStartedPulling="2026-04-28 19:20:02.780985993 +0000 UTC m=+187.286702638" lastFinishedPulling="2026-04-28 19:20:04.402081851 +0000 UTC m=+188.907798496" observedRunningTime="2026-04-28 19:20:06.707087884 +0000 UTC m=+191.212804551" watchObservedRunningTime="2026-04-28 19:20:06.707512112 +0000 UTC m=+191.213228836" Apr 28 19:20:07.033726 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.033690 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-65b7d86768-x95lb"] Apr 28 19:20:07.065709 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.065684 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-65b7d86768-x95lb"] Apr 28 19:20:07.065853 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.065825 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.069720 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.069691 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 28 19:20:07.069909 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.069719 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 28 19:20:07.070002 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.069991 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 28 19:20:07.070082 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.069998 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 28 19:20:07.070151 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.070120 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-v4vtb\"" Apr 28 19:20:07.074519 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.074503 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 28 19:20:07.076694 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.076669 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 28 19:20:07.215342 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.215305 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: 
\"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.215530 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.215407 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-secret-telemeter-client\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.215530 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.215439 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-metrics-client-ca\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.215530 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.215504 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-serving-certs-ca-bundle\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.215697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.215541 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpcv7\" (UniqueName: \"kubernetes.io/projected/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-kube-api-access-mpcv7\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.215697 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:20:07.215574 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-federate-client-tls\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.215697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.215600 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.215697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.215628 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-telemeter-client-tls\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.316394 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.316308 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-metrics-client-ca\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.316394 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.316351 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-serving-certs-ca-bundle\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.316394 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.316381 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpcv7\" (UniqueName: \"kubernetes.io/projected/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-kube-api-access-mpcv7\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.316644 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.316415 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-federate-client-tls\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.316644 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.316449 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.316644 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.316478 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-telemeter-client-tls\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: 
\"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.316644 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.316557 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.316644 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.316612 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-secret-telemeter-client\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.317215 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.317133 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-metrics-client-ca\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.317215 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.317190 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-serving-certs-ca-bundle\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.317526 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.317503 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.319039 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.319017 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-secret-telemeter-client\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.319140 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.319115 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-telemeter-client-tls\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.319267 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.319251 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.319355 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.319337 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-federate-client-tls\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: 
\"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.334316 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.334294 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpcv7\" (UniqueName: \"kubernetes.io/projected/ee9aa6c0-9937-4be5-8ccb-f718ea8400e8-kube-api-access-mpcv7\") pod \"telemeter-client-65b7d86768-x95lb\" (UID: \"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8\") " pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.376188 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.376147 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" Apr 28 19:20:07.499110 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.499081 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-65b7d86768-x95lb"] Apr 28 19:20:07.501773 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:20:07.501738 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee9aa6c0_9937_4be5_8ccb_f718ea8400e8.slice/crio-7f368c816390b40b24794463de99749ceea9f01f8a7bc2f68f1edc996f1c01eb WatchSource:0}: Error finding container 7f368c816390b40b24794463de99749ceea9f01f8a7bc2f68f1edc996f1c01eb: Status 404 returned error can't find the container with id 7f368c816390b40b24794463de99749ceea9f01f8a7bc2f68f1edc996f1c01eb Apr 28 19:20:07.686701 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:07.686612 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" event={"ID":"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8","Type":"ContainerStarted","Data":"7f368c816390b40b24794463de99749ceea9f01f8a7bc2f68f1edc996f1c01eb"} Apr 28 19:20:08.236208 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.236173 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:20:08.263384 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.263356 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.274104 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.274072 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 28 19:20:08.288365 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.288151 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 28 19:20:08.288365 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.288242 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 28 19:20:08.288365 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.288247 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 28 19:20:08.288365 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.288303 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 28 19:20:08.289530 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.289493 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 28 19:20:08.289645 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.289572 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-45fh9\"" Apr 28 19:20:08.289645 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.289575 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 28 
19:20:08.289747 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.289703 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 28 19:20:08.290009 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.289992 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 28 19:20:08.290978 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.290957 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 28 19:20:08.291703 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.291681 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 28 19:20:08.292225 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.292208 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 28 19:20:08.295452 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.295431 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:20:08.306878 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.306859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5k98pnjcp4132\"" Apr 28 19:20:08.307776 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.307759 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 28 19:20:08.425588 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425555 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.425588 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425589 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.425788 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425629 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.425788 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425653 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.425788 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425680 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.425788 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425760 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.425986 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.425986 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425859 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.425986 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425940 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426111 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.425984 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426111 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.426035 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426111 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.426078 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426111 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.426097 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426352 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.426127 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426352 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.426152 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m9wg7\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-kube-api-access-m9wg7\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426352 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.426228 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426352 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.426262 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.426352 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.426284 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527413 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527381 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527413 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527417 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527440 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527470 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527496 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527513 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527562 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527580 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527600 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527630 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.527657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527646 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527668 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9wg7\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-kube-api-access-m9wg7\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527707 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527728 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527759 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527782 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528133 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.527797 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528577 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.528461 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.528635 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.528586 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.533033 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.529354 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.533033 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.529468 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.533033 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.532276 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.533033 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.532324 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.533033 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.532757 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.533033 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:20:08.532784 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.533033 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.533028 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.541958 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.541939 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.542181 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.542140 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.542338 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.542294 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.542735 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.542628 
2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.542735 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.542712 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.542855 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.542754 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.542953 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.542934 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.543315 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.543296 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.544646 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.544625 
2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9wg7\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-kube-api-access-m9wg7\") pod \"prometheus-k8s-0\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.575472 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.575446 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:08.719154 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:08.719123 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:20:08.722155 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:20:08.722130 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e48ad42_00d3_42d2_9cc3_85e86bee57c1.slice/crio-93c1d59a63d6d74623034acf44cddb4d893fdb4716d826ac1bf47586d79835e8 WatchSource:0}: Error finding container 93c1d59a63d6d74623034acf44cddb4d893fdb4716d826ac1bf47586d79835e8: Status 404 returned error can't find the container with id 93c1d59a63d6d74623034acf44cddb4d893fdb4716d826ac1bf47586d79835e8 Apr 28 19:20:09.695415 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:09.695370 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerStarted","Data":"93c1d59a63d6d74623034acf44cddb4d893fdb4716d826ac1bf47586d79835e8"} Apr 28 19:20:10.699858 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:10.699831 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerID="d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d" exitCode=0 Apr 28 19:20:10.700181 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:10.699910 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerDied","Data":"d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d"} Apr 28 19:20:10.701348 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:10.701310 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" event={"ID":"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8","Type":"ContainerStarted","Data":"d41136769c21bff32c8fad7cb628d1fd7c479b6e4504f752cbebbcec8061eb63"} Apr 28 19:20:11.706855 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:11.706813 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" event={"ID":"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8","Type":"ContainerStarted","Data":"67bf2859a727ca8926095f3b97e39dfdc766747c23afa4de6125fc4f13a46fcb"} Apr 28 19:20:11.706855 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:11.706861 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" event={"ID":"ee9aa6c0-9937-4be5-8ccb-f718ea8400e8","Type":"ContainerStarted","Data":"0774f45937b6f0df6313216843265d98fb10e591ea2120f04c70c841ed72ba2f"} Apr 28 19:20:11.741174 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:11.741101 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-65b7d86768-x95lb" podStartSLOduration=1.728224629 podStartE2EDuration="4.741079729s" podCreationTimestamp="2026-04-28 19:20:07 +0000 UTC" firstStartedPulling="2026-04-28 19:20:07.503602581 +0000 UTC m=+192.009319225" lastFinishedPulling="2026-04-28 19:20:10.516457666 +0000 UTC m=+195.022174325" observedRunningTime="2026-04-28 19:20:11.738923138 +0000 UTC m=+196.244639816" watchObservedRunningTime="2026-04-28 19:20:11.741079729 +0000 UTC m=+196.246796399" Apr 28 19:20:14.717783 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:14.717745 2565 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerStarted","Data":"1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252"} Apr 28 19:20:14.717783 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:14.717784 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerStarted","Data":"7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963"} Apr 28 19:20:17.729715 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:17.729680 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerStarted","Data":"3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6"} Apr 28 19:20:17.729715 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:17.729718 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerStarted","Data":"05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4"} Apr 28 19:20:17.730220 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:17.729727 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerStarted","Data":"d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978"} Apr 28 19:20:17.730220 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:17.729736 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerStarted","Data":"d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8"} Apr 28 19:20:18.575966 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:18.575927 2565 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:26.677343 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.677286 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86bb844cdf-tq25p" podUID="7effa71b-c418-4208-b242-7ebd04c719b2" containerName="console" containerID="cri-o://3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0" gracePeriod=15 Apr 28 19:20:26.759842 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.759809 2565 generic.go:358] "Generic (PLEG): container finished" podID="533b151b-982f-4582-9292-150aa20dc9df" containerID="8272e3b996e379968aa7d30f5add2db4d5575d9f32265f12c0c90daf6a5c517c" exitCode=0 Apr 28 19:20:26.759960 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.759854 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" event={"ID":"533b151b-982f-4582-9292-150aa20dc9df","Type":"ContainerDied","Data":"8272e3b996e379968aa7d30f5add2db4d5575d9f32265f12c0c90daf6a5c517c"} Apr 28 19:20:26.760124 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.760111 2565 scope.go:117] "RemoveContainer" containerID="8272e3b996e379968aa7d30f5add2db4d5575d9f32265f12c0c90daf6a5c517c" Apr 28 19:20:26.778859 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.778819 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=10.845336469 podStartE2EDuration="18.778805795s" podCreationTimestamp="2026-04-28 19:20:08 +0000 UTC" firstStartedPulling="2026-04-28 19:20:08.724029206 +0000 UTC m=+193.229745850" lastFinishedPulling="2026-04-28 19:20:16.65749853 +0000 UTC m=+201.163215176" observedRunningTime="2026-04-28 19:20:17.800121183 +0000 UTC m=+202.305837848" watchObservedRunningTime="2026-04-28 19:20:26.778805795 +0000 UTC m=+211.284522461" Apr 28 19:20:26.925801 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:20:26.925781 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86bb844cdf-tq25p_7effa71b-c418-4208-b242-7ebd04c719b2/console/0.log" Apr 28 19:20:26.925926 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.925849 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:20:26.993201 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.993100 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-serving-cert\") pod \"7effa71b-c418-4208-b242-7ebd04c719b2\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " Apr 28 19:20:26.993201 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.993145 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-oauth-serving-cert\") pod \"7effa71b-c418-4208-b242-7ebd04c719b2\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " Apr 28 19:20:26.993411 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.993203 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-oauth-config\") pod \"7effa71b-c418-4208-b242-7ebd04c719b2\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " Apr 28 19:20:26.993411 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.993247 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-service-ca\") pod \"7effa71b-c418-4208-b242-7ebd04c719b2\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " Apr 28 19:20:26.993411 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:20:26.993274 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-console-config\") pod \"7effa71b-c418-4208-b242-7ebd04c719b2\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " Apr 28 19:20:26.993411 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.993318 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69wn6\" (UniqueName: \"kubernetes.io/projected/7effa71b-c418-4208-b242-7ebd04c719b2-kube-api-access-69wn6\") pod \"7effa71b-c418-4208-b242-7ebd04c719b2\" (UID: \"7effa71b-c418-4208-b242-7ebd04c719b2\") " Apr 28 19:20:26.993612 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.993583 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7effa71b-c418-4208-b242-7ebd04c719b2" (UID: "7effa71b-c418-4208-b242-7ebd04c719b2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:26.993846 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.993820 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-service-ca" (OuterVolumeSpecName: "service-ca") pod "7effa71b-c418-4208-b242-7ebd04c719b2" (UID: "7effa71b-c418-4208-b242-7ebd04c719b2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:26.993918 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.993821 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-console-config" (OuterVolumeSpecName: "console-config") pod "7effa71b-c418-4208-b242-7ebd04c719b2" (UID: "7effa71b-c418-4208-b242-7ebd04c719b2"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:26.995562 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.995537 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7effa71b-c418-4208-b242-7ebd04c719b2" (UID: "7effa71b-c418-4208-b242-7ebd04c719b2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:26.995562 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.995556 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7effa71b-c418-4208-b242-7ebd04c719b2" (UID: "7effa71b-c418-4208-b242-7ebd04c719b2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:26.995705 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:26.995624 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7effa71b-c418-4208-b242-7ebd04c719b2-kube-api-access-69wn6" (OuterVolumeSpecName: "kube-api-access-69wn6") pod "7effa71b-c418-4208-b242-7ebd04c719b2" (UID: "7effa71b-c418-4208-b242-7ebd04c719b2"). InnerVolumeSpecName "kube-api-access-69wn6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:20:27.094257 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.094218 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-service-ca\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:20:27.094257 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.094250 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-console-config\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:20:27.094257 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.094259 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-69wn6\" (UniqueName: \"kubernetes.io/projected/7effa71b-c418-4208-b242-7ebd04c719b2-kube-api-access-69wn6\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:20:27.094480 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.094269 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-serving-cert\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:20:27.094480 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.094280 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7effa71b-c418-4208-b242-7ebd04c719b2-oauth-serving-cert\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:20:27.094480 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.094289 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7effa71b-c418-4208-b242-7ebd04c719b2-console-oauth-config\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:20:27.765081 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:20:27.765044 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tnqkx" event={"ID":"533b151b-982f-4582-9292-150aa20dc9df","Type":"ContainerStarted","Data":"f84bd05039f4a1501016626bb1c546c1f47de3ebf3dcef25d8e9b80a709913a8"} Apr 28 19:20:27.766230 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.766212 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86bb844cdf-tq25p_7effa71b-c418-4208-b242-7ebd04c719b2/console/0.log" Apr 28 19:20:27.766336 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.766246 2565 generic.go:358] "Generic (PLEG): container finished" podID="7effa71b-c418-4208-b242-7ebd04c719b2" containerID="3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0" exitCode=2 Apr 28 19:20:27.766336 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.766297 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86bb844cdf-tq25p" Apr 28 19:20:27.766336 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.766313 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bb844cdf-tq25p" event={"ID":"7effa71b-c418-4208-b242-7ebd04c719b2","Type":"ContainerDied","Data":"3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0"} Apr 28 19:20:27.766434 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.766337 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bb844cdf-tq25p" event={"ID":"7effa71b-c418-4208-b242-7ebd04c719b2","Type":"ContainerDied","Data":"91415b4405447403696c5e344dff15f2c7d14d2d330f816220d1fce23462a7fd"} Apr 28 19:20:27.766434 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.766352 2565 scope.go:117] "RemoveContainer" containerID="3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0" Apr 28 19:20:27.775143 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.775126 2565 
scope.go:117] "RemoveContainer" containerID="3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0" Apr 28 19:20:27.775422 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:20:27.775398 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0\": container with ID starting with 3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0 not found: ID does not exist" containerID="3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0" Apr 28 19:20:27.775504 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.775430 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0"} err="failed to get container status \"3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0\": rpc error: code = NotFound desc = could not find container \"3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0\": container with ID starting with 3eaf7d438f4ad129f896c370249bfb648ac85cac0e45be5b308d3dc648a429f0 not found: ID does not exist" Apr 28 19:20:27.821228 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.821199 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86bb844cdf-tq25p"] Apr 28 19:20:27.829804 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:27.829776 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86bb844cdf-tq25p"] Apr 28 19:20:28.110376 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:20:28.110302 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7effa71b-c418-4208-b242-7ebd04c719b2" path="/var/lib/kubelet/pods/7effa71b-c418-4208-b242-7ebd04c719b2/volumes" Apr 28 19:21:07.838398 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:07.838359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:21:07.840618 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:07.840599 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbe36bec-c099-4625-b8c9-eb67c281b442-metrics-certs\") pod \"network-metrics-daemon-hndjc\" (UID: \"cbe36bec-c099-4625-b8c9-eb67c281b442\") " pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:21:08.014774 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:08.014741 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cnh2p\"" Apr 28 19:21:08.023077 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:08.023056 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hndjc" Apr 28 19:21:08.142049 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:08.141910 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hndjc"] Apr 28 19:21:08.144678 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:21:08.144645 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe36bec_c099_4625_b8c9_eb67c281b442.slice/crio-d21e3dd64cd497e78cff443e3e3b4576f35033c7a0d1be2923355fabe4af2ecd WatchSource:0}: Error finding container d21e3dd64cd497e78cff443e3e3b4576f35033c7a0d1be2923355fabe4af2ecd: Status 404 returned error can't find the container with id d21e3dd64cd497e78cff443e3e3b4576f35033c7a0d1be2923355fabe4af2ecd Apr 28 19:21:08.576583 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:08.576551 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:08.591741 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:08.591714 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:08.883704 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:08.883612 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hndjc" event={"ID":"cbe36bec-c099-4625-b8c9-eb67c281b442","Type":"ContainerStarted","Data":"d21e3dd64cd497e78cff443e3e3b4576f35033c7a0d1be2923355fabe4af2ecd"} Apr 28 19:21:08.901068 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:08.901044 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:09.887515 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:09.887477 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hndjc" 
event={"ID":"cbe36bec-c099-4625-b8c9-eb67c281b442","Type":"ContainerStarted","Data":"46526ac1ed86da49912bad838dc1b94181881d295152258d709efcdae1e81755"} Apr 28 19:21:09.887515 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:09.887513 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hndjc" event={"ID":"cbe36bec-c099-4625-b8c9-eb67c281b442","Type":"ContainerStarted","Data":"b497f9dacae3b87c02f388df2df7daced0ddd9246907b134b21f8b93ff1a3c09"} Apr 28 19:21:09.916302 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:09.916257 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hndjc" podStartSLOduration=252.717910612 podStartE2EDuration="4m13.916241495s" podCreationTimestamp="2026-04-28 19:16:56 +0000 UTC" firstStartedPulling="2026-04-28 19:21:08.14630501 +0000 UTC m=+252.652021654" lastFinishedPulling="2026-04-28 19:21:09.344635893 +0000 UTC m=+253.850352537" observedRunningTime="2026-04-28 19:21:09.915504111 +0000 UTC m=+254.421220787" watchObservedRunningTime="2026-04-28 19:21:09.916241495 +0000 UTC m=+254.421958161" Apr 28 19:21:26.621951 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.621916 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:21:26.622409 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.622376 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="prometheus" containerID="cri-o://7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963" gracePeriod=600 Apr 28 19:21:26.622542 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.622445 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy-web" 
containerID="cri-o://d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978" gracePeriod=600 Apr 28 19:21:26.622610 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.622462 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="config-reloader" containerID="cri-o://1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252" gracePeriod=600 Apr 28 19:21:26.622610 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.622405 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy" containerID="cri-o://05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4" gracePeriod=600 Apr 28 19:21:26.622712 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.622444 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="thanos-sidecar" containerID="cri-o://d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8" gracePeriod=600 Apr 28 19:21:26.622712 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.622437 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy-thanos" containerID="cri-o://3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6" gracePeriod=600 Apr 28 19:21:26.941434 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941336 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerID="3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6" exitCode=0 Apr 28 19:21:26.941434 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941369 2565 
generic.go:358] "Generic (PLEG): container finished" podID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerID="05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4" exitCode=0 Apr 28 19:21:26.941434 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941376 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerID="d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8" exitCode=0 Apr 28 19:21:26.941434 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941382 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerID="1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252" exitCode=0 Apr 28 19:21:26.941434 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941388 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerID="7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963" exitCode=0 Apr 28 19:21:26.941434 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941405 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerDied","Data":"3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6"} Apr 28 19:21:26.941757 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941451 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerDied","Data":"05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4"} Apr 28 19:21:26.941757 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941464 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerDied","Data":"d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8"} 
Apr 28 19:21:26.941757 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941476 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerDied","Data":"1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252"} Apr 28 19:21:26.941757 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:26.941488 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerDied","Data":"7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963"} Apr 28 19:21:27.859803 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.859782 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:27.906485 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906402 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-tls\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906485 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906456 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906489 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-metrics-client-certs\") pod 
\"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906516 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config-out\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906542 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-tls-assets\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906570 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-metrics-client-ca\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906595 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-kubelet-serving-ca-bundle\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906622 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-thanos-prometheus-http-client-file\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: 
\"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906651 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.906697 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906677 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-grpc-tls\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.907097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906714 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-trusted-ca-bundle\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.907097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906763 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9wg7\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-kube-api-access-m9wg7\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.907097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906803 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-web-config\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 
19:21:27.907097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906833 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.907097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906858 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-db\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.907097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906908 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-serving-certs-ca-bundle\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.907097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906950 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-rulefiles-0\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.907097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.906986 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-kube-rbac-proxy\") pod \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\" (UID: \"7e48ad42-00d3-42d2-9cc3-85e86bee57c1\") " Apr 28 19:21:27.908155 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.908105 
2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:27.909004 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.908692 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:27.909004 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.908869 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:27.910037 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.910009 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:27.910913 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.910887 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:21:27.911003 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.910965 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config-out" (OuterVolumeSpecName: "config-out") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:21:27.911057 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.911016 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.911973 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.911936 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.912809 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.912209 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:27.912809 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.912556 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.912809 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.912557 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-kube-api-access-m9wg7" (OuterVolumeSpecName: "kube-api-access-m9wg7") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "kube-api-access-m9wg7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:21:27.913070 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.912849 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config" (OuterVolumeSpecName: "config") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.913814 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.913786 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.913905 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.913840 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.913905 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.913854 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.913905 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.913876 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). 
InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.914260 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.914223 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:21:27.927348 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.927153 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-web-config" (OuterVolumeSpecName: "web-config") pod "7e48ad42-00d3-42d2-9cc3-85e86bee57c1" (UID: "7e48ad42-00d3-42d2-9cc3-85e86bee57c1"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:27.948411 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.948385 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerID="d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978" exitCode=0 Apr 28 19:21:27.948557 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.948461 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerDied","Data":"d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978"} Apr 28 19:21:27.948557 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.948505 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:27.948557 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.948521 2565 scope.go:117] "RemoveContainer" containerID="3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6" Apr 28 19:21:27.948688 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.948506 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7e48ad42-00d3-42d2-9cc3-85e86bee57c1","Type":"ContainerDied","Data":"93c1d59a63d6d74623034acf44cddb4d893fdb4716d826ac1bf47586d79835e8"} Apr 28 19:21:27.956964 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.956941 2565 scope.go:117] "RemoveContainer" containerID="05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4" Apr 28 19:21:27.963843 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.963825 2565 scope.go:117] "RemoveContainer" containerID="d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978" Apr 28 19:21:27.970019 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.970002 2565 scope.go:117] "RemoveContainer" containerID="d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8" Apr 28 19:21:27.976115 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.976096 2565 scope.go:117] "RemoveContainer" containerID="1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252" Apr 28 19:21:27.982293 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.982274 2565 scope.go:117] "RemoveContainer" containerID="7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963" Apr 28 19:21:27.988769 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.988748 2565 scope.go:117] "RemoveContainer" containerID="d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d" Apr 28 19:21:27.994825 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.994809 2565 scope.go:117] "RemoveContainer" containerID="3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6" Apr 28 19:21:27.995049 
ip-10-0-141-41 kubenswrapper[2565]: E0428 19:21:27.995032 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6\": container with ID starting with 3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6 not found: ID does not exist" containerID="3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6" Apr 28 19:21:27.995108 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.995058 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6"} err="failed to get container status \"3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6\": rpc error: code = NotFound desc = could not find container \"3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6\": container with ID starting with 3000cf55689c7585c766943b5a0e4ab04814c8fe3184670a7d449f17e10460e6 not found: ID does not exist" Apr 28 19:21:27.995108 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.995081 2565 scope.go:117] "RemoveContainer" containerID="05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4" Apr 28 19:21:27.995335 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:21:27.995316 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4\": container with ID starting with 05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4 not found: ID does not exist" containerID="05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4" Apr 28 19:21:27.995383 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.995340 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4"} 
err="failed to get container status \"05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4\": rpc error: code = NotFound desc = could not find container \"05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4\": container with ID starting with 05189f6fa007d13d57b00b89a738b683638f3b6e2b2e5d0239ffe8eba29dc9d4 not found: ID does not exist" Apr 28 19:21:27.995383 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.995357 2565 scope.go:117] "RemoveContainer" containerID="d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978" Apr 28 19:21:27.995549 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:21:27.995534 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978\": container with ID starting with d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978 not found: ID does not exist" containerID="d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978" Apr 28 19:21:27.995587 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.995552 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978"} err="failed to get container status \"d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978\": rpc error: code = NotFound desc = could not find container \"d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978\": container with ID starting with d56abd819cc88129059666849b1d787f56ac85900d878cabf0f07d3069a73978 not found: ID does not exist" Apr 28 19:21:27.995587 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.995565 2565 scope.go:117] "RemoveContainer" containerID="d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8" Apr 28 19:21:27.995799 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:21:27.995783 2565 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8\": container with ID starting with d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8 not found: ID does not exist" containerID="d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8" Apr 28 19:21:27.995858 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.995807 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8"} err="failed to get container status \"d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8\": rpc error: code = NotFound desc = could not find container \"d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8\": container with ID starting with d38e9decd4039e70841de7cd717bfecd671bd79dd6c9cd27aa8be067d0ef9db8 not found: ID does not exist" Apr 28 19:21:27.995858 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.995827 2565 scope.go:117] "RemoveContainer" containerID="1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252" Apr 28 19:21:27.996055 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:21:27.996040 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252\": container with ID starting with 1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252 not found: ID does not exist" containerID="1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252" Apr 28 19:21:27.996092 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.996059 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252"} err="failed to get container status \"1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252\": rpc error: 
code = NotFound desc = could not find container \"1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252\": container with ID starting with 1c6407949ccab1c6481fa8fbcfecebe49f335ed66d90cb0b143738c646466252 not found: ID does not exist" Apr 28 19:21:27.996092 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.996073 2565 scope.go:117] "RemoveContainer" containerID="7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963" Apr 28 19:21:27.996293 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:21:27.996277 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963\": container with ID starting with 7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963 not found: ID does not exist" containerID="7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963" Apr 28 19:21:27.996347 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.996295 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963"} err="failed to get container status \"7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963\": rpc error: code = NotFound desc = could not find container \"7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963\": container with ID starting with 7623b6ae64ff7e04d293984aaf34b4eab5a87eb6a780468607a5f8a6d652b963 not found: ID does not exist" Apr 28 19:21:27.996347 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.996306 2565 scope.go:117] "RemoveContainer" containerID="d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d" Apr 28 19:21:27.996491 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:21:27.996475 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d\": container with ID starting with d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d not found: ID does not exist" containerID="d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d" Apr 28 19:21:27.996535 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:27.996494 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d"} err="failed to get container status \"d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d\": rpc error: code = NotFound desc = could not find container \"d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d\": container with ID starting with d776c768cdb292e35488192bbcdf404fc75fca8ee870f5ef26741cf8093d9d2d not found: ID does not exist" Apr 28 19:21:28.007877 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007859 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.007940 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007879 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m9wg7\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-kube-api-access-m9wg7\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.007940 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007890 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-web-config\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.007940 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007900 2565 reconciler_common.go:299] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.007940 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007908 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-db\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.007940 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007917 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.007940 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007926 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.007940 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007935 2565 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-kube-rbac-proxy\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007945 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007955 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007965 2565 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-metrics-client-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007975 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-config-out\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007983 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-tls-assets\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.007991 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-metrics-client-ca\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.008000 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.008008 2565 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.008017 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.008138 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.008027 2565 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e48ad42-00d3-42d2-9cc3-85e86bee57c1-secret-grpc-tls\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:21:28.073205 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.073174 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:21:28.080647 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.080624 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:21:28.110948 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.110912 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" path="/var/lib/kubelet/pods/7e48ad42-00d3-42d2-9cc3-85e86bee57c1/volumes" Apr 28 19:21:28.237703 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237632 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:21:28.237905 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237893 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy" Apr 28 19:21:28.237941 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237906 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy" Apr 28 19:21:28.237941 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237919 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="config-reloader" Apr 28 19:21:28.237941 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237926 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="config-reloader" Apr 28 19:21:28.237941 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237936 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7effa71b-c418-4208-b242-7ebd04c719b2" containerName="console" Apr 28 19:21:28.237941 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237941 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7effa71b-c418-4208-b242-7ebd04c719b2" containerName="console" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237949 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy-web" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237955 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy-web" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237961 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="thanos-sidecar" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237967 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="thanos-sidecar" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237976 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="init-config-reloader" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237981 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="init-config-reloader" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237988 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy-thanos" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237992 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy-thanos" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.237998 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="prometheus" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.238003 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="prometheus" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.238051 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy-thanos" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.238058 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="thanos-sidecar" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.238065 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="prometheus" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.238072 2565 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="7effa71b-c418-4208-b242-7ebd04c719b2" containerName="console" Apr 28 19:21:28.238078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.238078 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy-web" Apr 28 19:21:28.238503 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.238084 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="config-reloader" Apr 28 19:21:28.238503 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.238090 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e48ad42-00d3-42d2-9cc3-85e86bee57c1" containerName="kube-rbac-proxy" Apr 28 19:21:28.243427 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.243411 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.246654 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.246630 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 28 19:21:28.246774 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.246706 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 28 19:21:28.246836 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.246823 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 28 19:21:28.246882 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.246856 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 28 19:21:28.249782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.248444 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 28 19:21:28.249782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.248706 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-45fh9\"" Apr 28 19:21:28.249782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.248980 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 28 19:21:28.249782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.249023 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 28 19:21:28.249782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.249120 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 28 19:21:28.249782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.249408 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 28 19:21:28.249782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.249484 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 28 19:21:28.249782 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.249640 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5k98pnjcp4132\"" Apr 28 19:21:28.250270 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.250057 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 28 19:21:28.254610 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.254587 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 28 19:21:28.256187 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.256152 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 28 19:21:28.264511 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.264077 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:21:28.310580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310549 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310589 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-config-out\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310774 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310607 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-config\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310774 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310679 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310774 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310716 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310774 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310736 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310899 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310777 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310899 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310825 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310899 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:21:28.310849 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310899 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310875 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.310899 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310896 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.311047 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310918 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.311047 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310936 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.311047 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310953 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.311047 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.310969 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-web-config\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.311047 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.311001 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.311047 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.311017 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmj2f\" (UniqueName: \"kubernetes.io/projected/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-kube-api-access-tmj2f\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.311047 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:21:28.311040 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412087 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412059 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412236 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412096 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412236 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412117 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412236 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412139 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412236 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412182 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-web-config\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412236 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412210 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412236 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412234 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmj2f\" (UniqueName: \"kubernetes.io/projected/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-kube-api-access-tmj2f\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412505 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412274 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412604 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412567 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412718 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412695 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412797 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412747 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-config-out\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412797 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412775 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-config\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412890 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412817 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412890 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412849 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.412890 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412882 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.413031 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412910 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.413031 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412939 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.413031 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.412981 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.413031 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.413000 2565 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.413239 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.413055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.413292 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.413239 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.413929 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.413846 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.415623 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.415327 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-config-out\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.415623 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:21:28.415490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.415623 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.415490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.415623 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.415600 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-web-config\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.415896 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.415683 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.415947 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.415908 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.416180 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.416135 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.416524 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.416489 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.416609 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.416562 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-config\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.417723 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.417699 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.417830 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.417817 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.417966 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.417950 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.418686 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.418669 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.421250 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.421226 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmj2f\" (UniqueName: \"kubernetes.io/projected/f75adb1e-f78f-4176-bbb0-f18794bdf5ef-kube-api-access-tmj2f\") pod \"prometheus-k8s-0\" (UID: \"f75adb1e-f78f-4176-bbb0-f18794bdf5ef\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.555543 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.555514 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:28.691348 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.691313 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:21:28.693843 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:21:28.693816 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf75adb1e_f78f_4176_bbb0_f18794bdf5ef.slice/crio-5e87242e427f2c7706ff0b021a8473ab34e8ce2c58fafc5fc5ba6e59544bb2f2 WatchSource:0}: Error finding container 5e87242e427f2c7706ff0b021a8473ab34e8ce2c58fafc5fc5ba6e59544bb2f2: Status 404 returned error can't find the container with id 5e87242e427f2c7706ff0b021a8473ab34e8ce2c58fafc5fc5ba6e59544bb2f2 Apr 28 19:21:28.952414 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.952324 2565 generic.go:358] "Generic (PLEG): container finished" podID="f75adb1e-f78f-4176-bbb0-f18794bdf5ef" containerID="66e4da7891415692cf9fa6a1c7c6b17bd823934982c7e48fa0f69ebf2b286aa6" exitCode=0 Apr 28 19:21:28.952836 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.952405 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f75adb1e-f78f-4176-bbb0-f18794bdf5ef","Type":"ContainerDied","Data":"66e4da7891415692cf9fa6a1c7c6b17bd823934982c7e48fa0f69ebf2b286aa6"} Apr 28 19:21:28.952836 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:28.952438 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f75adb1e-f78f-4176-bbb0-f18794bdf5ef","Type":"ContainerStarted","Data":"5e87242e427f2c7706ff0b021a8473ab34e8ce2c58fafc5fc5ba6e59544bb2f2"} Apr 28 19:21:29.960109 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:29.960027 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f75adb1e-f78f-4176-bbb0-f18794bdf5ef","Type":"ContainerStarted","Data":"43a3dc57c8d829e7f0d576121bb3a5cb0e783a0d71407dfa5e3e9889a5215492"} Apr 28 19:21:29.960109 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:29.960061 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f75adb1e-f78f-4176-bbb0-f18794bdf5ef","Type":"ContainerStarted","Data":"9c6e3b010e648cabf4cd1773b91069082760d1cfc08dde8e6c7ed23f4c02dfdc"} Apr 28 19:21:29.960109 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:29.960073 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f75adb1e-f78f-4176-bbb0-f18794bdf5ef","Type":"ContainerStarted","Data":"5826a2f3d7c85e7ad85d28bb37578d40c039c2d98989dc557c4861567624cd8f"} Apr 28 19:21:29.960109 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:29.960082 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f75adb1e-f78f-4176-bbb0-f18794bdf5ef","Type":"ContainerStarted","Data":"1f53a16d8d882fdfdf10ce282e660783212c46291eeb126a38d69557a2cdf7c6"} Apr 28 19:21:29.960109 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:29.960090 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f75adb1e-f78f-4176-bbb0-f18794bdf5ef","Type":"ContainerStarted","Data":"a14eb5370ed6c3faf692a56e6b9905d6cda038e6db92f88433043d73ba7166c4"} Apr 28 19:21:29.960109 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:29.960099 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f75adb1e-f78f-4176-bbb0-f18794bdf5ef","Type":"ContainerStarted","Data":"77f2105f3a71edd8938b7e52bbef65e33235c7b2fe3472c3a42a137d44410916"} Apr 28 19:21:30.026187 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:30.022528 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.022510416 podStartE2EDuration="2.022510416s" podCreationTimestamp="2026-04-28 19:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:21:30.022188669 +0000 UTC m=+274.527905335" watchObservedRunningTime="2026-04-28 19:21:30.022510416 +0000 UTC m=+274.528227085" Apr 28 19:21:33.556460 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:33.556423 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:55.997435 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:21:55.997407 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 28 19:22:14.556779 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.556742 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qx6zq"] Apr 28 19:22:14.559958 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.559943 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.564296 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.564275 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 28 19:22:14.571060 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.571040 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qx6zq"] Apr 28 19:22:14.589665 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.589641 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2174f747-901a-4309-a1e8-74f7920485ec-kubelet-config\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.589759 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.589677 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2174f747-901a-4309-a1e8-74f7920485ec-original-pull-secret\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.589759 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.589695 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2174f747-901a-4309-a1e8-74f7920485ec-dbus\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.690189 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.690146 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/2174f747-901a-4309-a1e8-74f7920485ec-kubelet-config\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.690330 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.690199 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2174f747-901a-4309-a1e8-74f7920485ec-original-pull-secret\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.690330 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.690228 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2174f747-901a-4309-a1e8-74f7920485ec-dbus\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.690330 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.690264 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2174f747-901a-4309-a1e8-74f7920485ec-kubelet-config\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.690440 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.690419 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2174f747-901a-4309-a1e8-74f7920485ec-dbus\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.692494 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.692467 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2174f747-901a-4309-a1e8-74f7920485ec-original-pull-secret\") pod \"global-pull-secret-syncer-qx6zq\" (UID: \"2174f747-901a-4309-a1e8-74f7920485ec\") " pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.868692 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.868594 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qx6zq" Apr 28 19:22:14.993036 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.993009 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qx6zq"] Apr 28 19:22:14.995007 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:22:14.994970 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2174f747_901a_4309_a1e8_74f7920485ec.slice/crio-88770c26cd529c7f01da302d26408d4a20f3773c26070436c6d518bbb2b2ed3e WatchSource:0}: Error finding container 88770c26cd529c7f01da302d26408d4a20f3773c26070436c6d518bbb2b2ed3e: Status 404 returned error can't find the container with id 88770c26cd529c7f01da302d26408d4a20f3773c26070436c6d518bbb2b2ed3e Apr 28 19:22:14.996825 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:14.996810 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:22:15.097278 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:15.097244 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qx6zq" event={"ID":"2174f747-901a-4309-a1e8-74f7920485ec","Type":"ContainerStarted","Data":"88770c26cd529c7f01da302d26408d4a20f3773c26070436c6d518bbb2b2ed3e"} Apr 28 19:22:19.110988 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:19.110948 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qx6zq" 
event={"ID":"2174f747-901a-4309-a1e8-74f7920485ec","Type":"ContainerStarted","Data":"d93bbef5c6991b2940975906263779753be98bec34e7251a3d9bdcf17ac54492"} Apr 28 19:22:19.140950 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:19.140902 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qx6zq" podStartSLOduration=1.164300142 podStartE2EDuration="5.140885465s" podCreationTimestamp="2026-04-28 19:22:14 +0000 UTC" firstStartedPulling="2026-04-28 19:22:14.996938299 +0000 UTC m=+319.502654943" lastFinishedPulling="2026-04-28 19:22:18.973523622 +0000 UTC m=+323.479240266" observedRunningTime="2026-04-28 19:22:19.140367604 +0000 UTC m=+323.646084270" watchObservedRunningTime="2026-04-28 19:22:19.140885465 +0000 UTC m=+323.646602131" Apr 28 19:22:28.556326 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:28.556289 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:22:28.571645 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:28.571614 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:22:29.153603 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:22:29.153577 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:24:46.175336 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.175302 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-gknhk"] Apr 28 19:24:46.177388 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.177369 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" Apr 28 19:24:46.180125 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.180105 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 28 19:24:46.181351 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.181324 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 28 19:24:46.181546 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.181423 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-89w55\"" Apr 28 19:24:46.189003 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.188981 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-gknhk"] Apr 28 19:24:46.244090 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.244057 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0b81422-e41c-4510-abb1-bcca8fd80304-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-gknhk\" (UID: \"d0b81422-e41c-4510-abb1-bcca8fd80304\") " pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" Apr 28 19:24:46.244258 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.244132 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhgb\" (UniqueName: \"kubernetes.io/projected/d0b81422-e41c-4510-abb1-bcca8fd80304-kube-api-access-dkhgb\") pod \"cert-manager-cainjector-68b757865b-gknhk\" (UID: \"d0b81422-e41c-4510-abb1-bcca8fd80304\") " pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" Apr 28 19:24:46.345193 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.345142 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dkhgb\" (UniqueName: \"kubernetes.io/projected/d0b81422-e41c-4510-abb1-bcca8fd80304-kube-api-access-dkhgb\") pod \"cert-manager-cainjector-68b757865b-gknhk\" (UID: \"d0b81422-e41c-4510-abb1-bcca8fd80304\") " pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" Apr 28 19:24:46.345361 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.345220 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0b81422-e41c-4510-abb1-bcca8fd80304-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-gknhk\" (UID: \"d0b81422-e41c-4510-abb1-bcca8fd80304\") " pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" Apr 28 19:24:46.353519 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.353493 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0b81422-e41c-4510-abb1-bcca8fd80304-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-gknhk\" (UID: \"d0b81422-e41c-4510-abb1-bcca8fd80304\") " pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" Apr 28 19:24:46.353634 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.353557 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkhgb\" (UniqueName: \"kubernetes.io/projected/d0b81422-e41c-4510-abb1-bcca8fd80304-kube-api-access-dkhgb\") pod \"cert-manager-cainjector-68b757865b-gknhk\" (UID: \"d0b81422-e41c-4510-abb1-bcca8fd80304\") " pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" Apr 28 19:24:46.502010 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.501921 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" Apr 28 19:24:46.619613 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:46.619519 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-gknhk"] Apr 28 19:24:46.622293 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:24:46.622261 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0b81422_e41c_4510_abb1_bcca8fd80304.slice/crio-b7828872c64bc25413826a2c6da8916832c37f64cb08d05d2e102910a96a0b26 WatchSource:0}: Error finding container b7828872c64bc25413826a2c6da8916832c37f64cb08d05d2e102910a96a0b26: Status 404 returned error can't find the container with id b7828872c64bc25413826a2c6da8916832c37f64cb08d05d2e102910a96a0b26 Apr 28 19:24:47.520319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:47.520275 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" event={"ID":"d0b81422-e41c-4510-abb1-bcca8fd80304","Type":"ContainerStarted","Data":"b7828872c64bc25413826a2c6da8916832c37f64cb08d05d2e102910a96a0b26"} Apr 28 19:24:50.530204 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:50.530153 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" event={"ID":"d0b81422-e41c-4510-abb1-bcca8fd80304","Type":"ContainerStarted","Data":"0a749eae66bcd85d6bb422423cead0ba802c58a583443beaf3d43663adb5af1f"} Apr 28 19:24:50.549509 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:24:50.549462 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-gknhk" podStartSLOduration=1.531443899 podStartE2EDuration="4.549448488s" podCreationTimestamp="2026-04-28 19:24:46 +0000 UTC" firstStartedPulling="2026-04-28 19:24:46.624128719 +0000 UTC m=+471.129845367" lastFinishedPulling="2026-04-28 
19:24:49.642133313 +0000 UTC m=+474.147849956" observedRunningTime="2026-04-28 19:24:50.547195446 +0000 UTC m=+475.052912112" watchObservedRunningTime="2026-04-28 19:24:50.549448488 +0000 UTC m=+475.055165156" Apr 28 19:25:17.452024 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.451991 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj"] Apr 28 19:25:17.454369 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.454349 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.458951 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.458927 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 28 19:25:17.459056 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.458974 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 28 19:25:17.459124 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.459053 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 28 19:25:17.459208 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.459198 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:25:17.459284 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.459220 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-z4smj\"" Apr 28 19:25:17.459340 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.459281 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 28 19:25:17.470945 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:25:17.470922 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj"] Apr 28 19:25:17.603810 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.603777 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8wj\" (UniqueName: \"kubernetes.io/projected/e928964b-a243-4db2-8b96-8c4ca7e022f8-kube-api-access-rx8wj\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.603988 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.603825 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e928964b-a243-4db2-8b96-8c4ca7e022f8-cert\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.603988 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.603885 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e928964b-a243-4db2-8b96-8c4ca7e022f8-manager-config\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.603988 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.603963 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e928964b-a243-4db2-8b96-8c4ca7e022f8-metrics-cert\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.705143 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.705057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e928964b-a243-4db2-8b96-8c4ca7e022f8-manager-config\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.705143 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.705103 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e928964b-a243-4db2-8b96-8c4ca7e022f8-metrics-cert\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.705143 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.705140 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8wj\" (UniqueName: \"kubernetes.io/projected/e928964b-a243-4db2-8b96-8c4ca7e022f8-kube-api-access-rx8wj\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.705379 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.705195 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e928964b-a243-4db2-8b96-8c4ca7e022f8-cert\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.705731 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.705703 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e928964b-a243-4db2-8b96-8c4ca7e022f8-manager-config\") pod 
\"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.707619 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.707595 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e928964b-a243-4db2-8b96-8c4ca7e022f8-cert\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.707720 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.707638 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e928964b-a243-4db2-8b96-8c4ca7e022f8-metrics-cert\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.719069 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.719049 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8wj\" (UniqueName: \"kubernetes.io/projected/e928964b-a243-4db2-8b96-8c4ca7e022f8-kube-api-access-rx8wj\") pod \"lws-controller-manager-79cf4cb497-8sztj\" (UID: \"e928964b-a243-4db2-8b96-8c4ca7e022f8\") " pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.764062 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.764043 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:17.912378 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:17.912351 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj"] Apr 28 19:25:17.914141 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:25:17.914114 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode928964b_a243_4db2_8b96_8c4ca7e022f8.slice/crio-9d053401aa2d8e0072ceccadab52233698e6bc28a734df54ab2825ce86e90168 WatchSource:0}: Error finding container 9d053401aa2d8e0072ceccadab52233698e6bc28a734df54ab2825ce86e90168: Status 404 returned error can't find the container with id 9d053401aa2d8e0072ceccadab52233698e6bc28a734df54ab2825ce86e90168 Apr 28 19:25:18.609664 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:18.609629 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" event={"ID":"e928964b-a243-4db2-8b96-8c4ca7e022f8","Type":"ContainerStarted","Data":"9d053401aa2d8e0072ceccadab52233698e6bc28a734df54ab2825ce86e90168"} Apr 28 19:25:20.621791 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:20.621753 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" event={"ID":"e928964b-a243-4db2-8b96-8c4ca7e022f8","Type":"ContainerStarted","Data":"8d6abdccf575fb76ad5413dda69a0dae33672aa845bd151ba51f8ec30c92e4a8"} Apr 28 19:25:20.622147 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:20.621875 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:25:20.642358 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:20.642312 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" podStartSLOduration=1.039833888 podStartE2EDuration="3.642297307s" podCreationTimestamp="2026-04-28 19:25:17 +0000 UTC" firstStartedPulling="2026-04-28 19:25:17.915998141 +0000 UTC m=+502.421714788" lastFinishedPulling="2026-04-28 19:25:20.518461563 +0000 UTC m=+505.024178207" observedRunningTime="2026-04-28 19:25:20.64067365 +0000 UTC m=+505.146390315" watchObservedRunningTime="2026-04-28 19:25:20.642297307 +0000 UTC m=+505.148013977" Apr 28 19:25:31.627309 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:25:31.627273 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-79cf4cb497-8sztj" Apr 28 19:26:09.196973 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.196938 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4"] Apr 28 19:26:09.201235 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.201218 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" Apr 28 19:26:09.204891 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.204868 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 28 19:26:09.205251 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.205235 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 28 19:26:09.205996 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.205979 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-9jcxm\"" Apr 28 19:26:09.206059 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.205979 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 28 19:26:09.216380 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.216349 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4"] Apr 28 19:26:09.346556 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.346519 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spk67\" (UniqueName: \"kubernetes.io/projected/a607d0e0-237b-4251-b606-3b4e9e2db6c6-kube-api-access-spk67\") pod \"dns-operator-controller-manager-844548ff4c-fz2s4\" (UID: \"a607d0e0-237b-4251-b606-3b4e9e2db6c6\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" Apr 28 19:26:09.447373 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.447286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spk67\" (UniqueName: \"kubernetes.io/projected/a607d0e0-237b-4251-b606-3b4e9e2db6c6-kube-api-access-spk67\") pod \"dns-operator-controller-manager-844548ff4c-fz2s4\" 
(UID: \"a607d0e0-237b-4251-b606-3b4e9e2db6c6\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" Apr 28 19:26:09.458238 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.458208 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spk67\" (UniqueName: \"kubernetes.io/projected/a607d0e0-237b-4251-b606-3b4e9e2db6c6-kube-api-access-spk67\") pod \"dns-operator-controller-manager-844548ff4c-fz2s4\" (UID: \"a607d0e0-237b-4251-b606-3b4e9e2db6c6\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" Apr 28 19:26:09.510805 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.510774 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" Apr 28 19:26:09.636734 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.636707 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4"] Apr 28 19:26:09.638978 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:26:09.638944 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda607d0e0_237b_4251_b606_3b4e9e2db6c6.slice/crio-6c179785f51c69f329f0f06014639238ba1cdabd613e7bcdd08588a1aa459042 WatchSource:0}: Error finding container 6c179785f51c69f329f0f06014639238ba1cdabd613e7bcdd08588a1aa459042: Status 404 returned error can't find the container with id 6c179785f51c69f329f0f06014639238ba1cdabd613e7bcdd08588a1aa459042 Apr 28 19:26:09.760529 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:09.760501 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" event={"ID":"a607d0e0-237b-4251-b606-3b4e9e2db6c6","Type":"ContainerStarted","Data":"6c179785f51c69f329f0f06014639238ba1cdabd613e7bcdd08588a1aa459042"} Apr 28 19:26:11.578729 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:26:11.578691 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k"] Apr 28 19:26:11.581326 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:11.581303 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" Apr 28 19:26:11.584202 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:11.584180 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-72qgd\"" Apr 28 19:26:11.598977 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:11.598950 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k"] Apr 28 19:26:11.664242 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:11.664206 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7h2k\" (UniqueName: \"kubernetes.io/projected/442d9435-e707-4a83-8ae3-039a186a940b-kube-api-access-p7h2k\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rtl6k\" (UID: \"442d9435-e707-4a83-8ae3-039a186a940b\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" Apr 28 19:26:11.765060 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:11.765026 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7h2k\" (UniqueName: \"kubernetes.io/projected/442d9435-e707-4a83-8ae3-039a186a940b-kube-api-access-p7h2k\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rtl6k\" (UID: \"442d9435-e707-4a83-8ae3-039a186a940b\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" Apr 28 19:26:11.776847 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:11.776820 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7h2k\" (UniqueName: 
\"kubernetes.io/projected/442d9435-e707-4a83-8ae3-039a186a940b-kube-api-access-p7h2k\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rtl6k\" (UID: \"442d9435-e707-4a83-8ae3-039a186a940b\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" Apr 28 19:26:11.891912 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:11.891833 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" Apr 28 19:26:12.019523 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:12.019451 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k"] Apr 28 19:26:12.022579 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:26:12.022547 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod442d9435_e707_4a83_8ae3_039a186a940b.slice/crio-619999888c4694d59386e4b42189f88f7a74ead67988e36904cb70dafdfec163 WatchSource:0}: Error finding container 619999888c4694d59386e4b42189f88f7a74ead67988e36904cb70dafdfec163: Status 404 returned error can't find the container with id 619999888c4694d59386e4b42189f88f7a74ead67988e36904cb70dafdfec163 Apr 28 19:26:12.771645 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:12.771606 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" event={"ID":"442d9435-e707-4a83-8ae3-039a186a940b","Type":"ContainerStarted","Data":"619999888c4694d59386e4b42189f88f7a74ead67988e36904cb70dafdfec163"} Apr 28 19:26:15.784637 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:15.784597 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" 
event={"ID":"442d9435-e707-4a83-8ae3-039a186a940b","Type":"ContainerStarted","Data":"bca347a191084778a3d3f9dc8842c99b376fbee87a538e6becb78881985496b2"} Apr 28 19:26:15.785071 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:15.784744 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" Apr 28 19:26:15.786018 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:15.785996 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" event={"ID":"a607d0e0-237b-4251-b606-3b4e9e2db6c6","Type":"ContainerStarted","Data":"87641bf5b92240ae198617dc7332d4a173d71265ec66061b24173253702dc534"} Apr 28 19:26:15.786172 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:15.786143 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" Apr 28 19:26:15.806098 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:15.806050 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" podStartSLOduration=1.977767181 podStartE2EDuration="4.806038297s" podCreationTimestamp="2026-04-28 19:26:11 +0000 UTC" firstStartedPulling="2026-04-28 19:26:12.02475017 +0000 UTC m=+556.530466817" lastFinishedPulling="2026-04-28 19:26:14.853021289 +0000 UTC m=+559.358737933" observedRunningTime="2026-04-28 19:26:15.804670627 +0000 UTC m=+560.310387317" watchObservedRunningTime="2026-04-28 19:26:15.806038297 +0000 UTC m=+560.311754964" Apr 28 19:26:15.824456 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:15.824410 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" podStartSLOduration=1.6129346469999999 podStartE2EDuration="6.824397423s" podCreationTimestamp="2026-04-28 19:26:09 +0000 UTC" 
firstStartedPulling="2026-04-28 19:26:09.64104096 +0000 UTC m=+554.146757605" lastFinishedPulling="2026-04-28 19:26:14.852503737 +0000 UTC m=+559.358220381" observedRunningTime="2026-04-28 19:26:15.822883513 +0000 UTC m=+560.328600180" watchObservedRunningTime="2026-04-28 19:26:15.824397423 +0000 UTC m=+560.330114087" Apr 28 19:26:26.791652 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:26.791620 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-fz2s4" Apr 28 19:26:26.792033 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:26:26.791682 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rtl6k" Apr 28 19:27:03.936591 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:03.936557 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"] Apr 28 19:27:03.940862 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:03.940842 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:03.945387 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:03.945368 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 28 19:27:03.945510 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:03.945393 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xgz8s\""
Apr 28 19:27:03.953219 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:03.953199 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"]
Apr 28 19:27:03.971559 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:03.971530 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6dvt\" (UniqueName: \"kubernetes.io/projected/ff03eef9-9efd-42c9-8deb-112a6145a129-kube-api-access-q6dvt\") pod \"limitador-limitador-64c8f475fb-9nvp7\" (UID: \"ff03eef9-9efd-42c9-8deb-112a6145a129\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:03.971676 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:03.971613 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff03eef9-9efd-42c9-8deb-112a6145a129-config-file\") pod \"limitador-limitador-64c8f475fb-9nvp7\" (UID: \"ff03eef9-9efd-42c9-8deb-112a6145a129\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:04.036086 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:04.036054 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"]
Apr 28 19:27:04.073009 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:04.072977 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff03eef9-9efd-42c9-8deb-112a6145a129-config-file\") pod \"limitador-limitador-64c8f475fb-9nvp7\" (UID: \"ff03eef9-9efd-42c9-8deb-112a6145a129\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:04.073149 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:04.073061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6dvt\" (UniqueName: \"kubernetes.io/projected/ff03eef9-9efd-42c9-8deb-112a6145a129-kube-api-access-q6dvt\") pod \"limitador-limitador-64c8f475fb-9nvp7\" (UID: \"ff03eef9-9efd-42c9-8deb-112a6145a129\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:04.073624 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:04.073607 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff03eef9-9efd-42c9-8deb-112a6145a129-config-file\") pod \"limitador-limitador-64c8f475fb-9nvp7\" (UID: \"ff03eef9-9efd-42c9-8deb-112a6145a129\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:04.082657 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:04.082633 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6dvt\" (UniqueName: \"kubernetes.io/projected/ff03eef9-9efd-42c9-8deb-112a6145a129-kube-api-access-q6dvt\") pod \"limitador-limitador-64c8f475fb-9nvp7\" (UID: \"ff03eef9-9efd-42c9-8deb-112a6145a129\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:04.250698 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:04.250619 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:04.394593 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:04.394568 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"]
Apr 28 19:27:04.397231 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:27:04.397200 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff03eef9_9efd_42c9_8deb_112a6145a129.slice/crio-312fe0a6dff613a158ef03b28745741ee6282d961754d50a65d5c4ebe35e8d2e WatchSource:0}: Error finding container 312fe0a6dff613a158ef03b28745741ee6282d961754d50a65d5c4ebe35e8d2e: Status 404 returned error can't find the container with id 312fe0a6dff613a158ef03b28745741ee6282d961754d50a65d5c4ebe35e8d2e
Apr 28 19:27:04.947647 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:04.947610 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7" event={"ID":"ff03eef9-9efd-42c9-8deb-112a6145a129","Type":"ContainerStarted","Data":"312fe0a6dff613a158ef03b28745741ee6282d961754d50a65d5c4ebe35e8d2e"}
Apr 28 19:27:05.014999 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.014970 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-zrqwv"]
Apr 28 19:27:05.017333 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.017311 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-zrqwv"
Apr 28 19:27:05.020308 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.020287 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-m9msd\""
Apr 28 19:27:05.026137 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.026115 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-zrqwv"]
Apr 28 19:27:05.081319 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.081285 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvxlv\" (UniqueName: \"kubernetes.io/projected/b7480381-ac39-4da5-9fd8-54d3f30e601e-kube-api-access-dvxlv\") pod \"authorino-79cbc94b89-zrqwv\" (UID: \"b7480381-ac39-4da5-9fd8-54d3f30e601e\") " pod="kuadrant-system/authorino-79cbc94b89-zrqwv"
Apr 28 19:27:05.182280 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.182240 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvxlv\" (UniqueName: \"kubernetes.io/projected/b7480381-ac39-4da5-9fd8-54d3f30e601e-kube-api-access-dvxlv\") pod \"authorino-79cbc94b89-zrqwv\" (UID: \"b7480381-ac39-4da5-9fd8-54d3f30e601e\") " pod="kuadrant-system/authorino-79cbc94b89-zrqwv"
Apr 28 19:27:05.198914 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.198849 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvxlv\" (UniqueName: \"kubernetes.io/projected/b7480381-ac39-4da5-9fd8-54d3f30e601e-kube-api-access-dvxlv\") pod \"authorino-79cbc94b89-zrqwv\" (UID: \"b7480381-ac39-4da5-9fd8-54d3f30e601e\") " pod="kuadrant-system/authorino-79cbc94b89-zrqwv"
Apr 28 19:27:05.329977 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.329941 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-zrqwv"
Apr 28 19:27:05.461866 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.461777 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-zrqwv"]
Apr 28 19:27:05.464792 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:27:05.464765 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7480381_ac39_4da5_9fd8_54d3f30e601e.slice/crio-b66ef1192be82ee50de110ef8ac575e3a8d407bd801b490e9b1ce9f9b1d5919e WatchSource:0}: Error finding container b66ef1192be82ee50de110ef8ac575e3a8d407bd801b490e9b1ce9f9b1d5919e: Status 404 returned error can't find the container with id b66ef1192be82ee50de110ef8ac575e3a8d407bd801b490e9b1ce9f9b1d5919e
Apr 28 19:27:05.951885 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:05.951855 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-zrqwv" event={"ID":"b7480381-ac39-4da5-9fd8-54d3f30e601e","Type":"ContainerStarted","Data":"b66ef1192be82ee50de110ef8ac575e3a8d407bd801b490e9b1ce9f9b1d5919e"}
Apr 28 19:27:09.968505 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:09.968407 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-zrqwv" event={"ID":"b7480381-ac39-4da5-9fd8-54d3f30e601e","Type":"ContainerStarted","Data":"a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64"}
Apr 28 19:27:09.969800 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:09.969770 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7" event={"ID":"ff03eef9-9efd-42c9-8deb-112a6145a129","Type":"ContainerStarted","Data":"6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8"}
Apr 28 19:27:09.969929 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:09.969886 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:10.036945 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:10.036880 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7" podStartSLOduration=1.7387508120000001 podStartE2EDuration="7.036860411s" podCreationTimestamp="2026-04-28 19:27:03 +0000 UTC" firstStartedPulling="2026-04-28 19:27:04.398949367 +0000 UTC m=+608.904666027" lastFinishedPulling="2026-04-28 19:27:09.697058982 +0000 UTC m=+614.202775626" observedRunningTime="2026-04-28 19:27:10.036672493 +0000 UTC m=+614.542389160" watchObservedRunningTime="2026-04-28 19:27:10.036860411 +0000 UTC m=+614.542577080"
Apr 28 19:27:10.037397 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:10.037357 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-zrqwv" podStartSLOduration=0.797431864 podStartE2EDuration="5.037347553s" podCreationTimestamp="2026-04-28 19:27:05 +0000 UTC" firstStartedPulling="2026-04-28 19:27:05.466476546 +0000 UTC m=+609.972193205" lastFinishedPulling="2026-04-28 19:27:09.706392235 +0000 UTC m=+614.212108894" observedRunningTime="2026-04-28 19:27:09.999041516 +0000 UTC m=+614.504758181" watchObservedRunningTime="2026-04-28 19:27:10.037347553 +0000 UTC m=+614.543064223"
Apr 28 19:27:18.033076 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.033037 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"]
Apr 28 19:27:18.034612 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.033324 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7" podUID="ff03eef9-9efd-42c9-8deb-112a6145a129" containerName="limitador" containerID="cri-o://6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8" gracePeriod=30
Apr 28 19:27:18.034612 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.033949 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:18.567017 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.566991 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:18.698592 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.698511 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff03eef9-9efd-42c9-8deb-112a6145a129-config-file\") pod \"ff03eef9-9efd-42c9-8deb-112a6145a129\" (UID: \"ff03eef9-9efd-42c9-8deb-112a6145a129\") "
Apr 28 19:27:18.698592 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.698592 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6dvt\" (UniqueName: \"kubernetes.io/projected/ff03eef9-9efd-42c9-8deb-112a6145a129-kube-api-access-q6dvt\") pod \"ff03eef9-9efd-42c9-8deb-112a6145a129\" (UID: \"ff03eef9-9efd-42c9-8deb-112a6145a129\") "
Apr 28 19:27:18.698887 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.698863 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff03eef9-9efd-42c9-8deb-112a6145a129-config-file" (OuterVolumeSpecName: "config-file") pod "ff03eef9-9efd-42c9-8deb-112a6145a129" (UID: "ff03eef9-9efd-42c9-8deb-112a6145a129"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:27:18.700819 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.700796 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff03eef9-9efd-42c9-8deb-112a6145a129-kube-api-access-q6dvt" (OuterVolumeSpecName: "kube-api-access-q6dvt") pod "ff03eef9-9efd-42c9-8deb-112a6145a129" (UID: "ff03eef9-9efd-42c9-8deb-112a6145a129"). InnerVolumeSpecName "kube-api-access-q6dvt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:27:18.799422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.799384 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q6dvt\" (UniqueName: \"kubernetes.io/projected/ff03eef9-9efd-42c9-8deb-112a6145a129-kube-api-access-q6dvt\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:27:18.799422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:18.799415 2565 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff03eef9-9efd-42c9-8deb-112a6145a129-config-file\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:27:19.001410 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.001332 2565 generic.go:358] "Generic (PLEG): container finished" podID="ff03eef9-9efd-42c9-8deb-112a6145a129" containerID="6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8" exitCode=0
Apr 28 19:27:19.001410 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.001390 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"
Apr 28 19:27:19.001624 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.001414 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7" event={"ID":"ff03eef9-9efd-42c9-8deb-112a6145a129","Type":"ContainerDied","Data":"6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8"}
Apr 28 19:27:19.001624 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.001445 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9nvp7" event={"ID":"ff03eef9-9efd-42c9-8deb-112a6145a129","Type":"ContainerDied","Data":"312fe0a6dff613a158ef03b28745741ee6282d961754d50a65d5c4ebe35e8d2e"}
Apr 28 19:27:19.001624 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.001460 2565 scope.go:117] "RemoveContainer" containerID="6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8"
Apr 28 19:27:19.009268 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.009251 2565 scope.go:117] "RemoveContainer" containerID="6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8"
Apr 28 19:27:19.009513 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:27:19.009492 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8\": container with ID starting with 6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8 not found: ID does not exist" containerID="6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8"
Apr 28 19:27:19.009590 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.009526 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8"} err="failed to get container status \"6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8\": rpc error: code = NotFound desc = could not find container \"6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8\": container with ID starting with 6fdcd1b1a06de31030020e552032d8180c342d573fd75c1ebe441e0c1fef58e8 not found: ID does not exist"
Apr 28 19:27:19.022848 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.022817 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"]
Apr 28 19:27:19.024272 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:19.024255 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9nvp7"]
Apr 28 19:27:20.110671 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:20.110638 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff03eef9-9efd-42c9-8deb-112a6145a129" path="/var/lib/kubelet/pods/ff03eef9-9efd-42c9-8deb-112a6145a129/volumes"
Apr 28 19:27:28.216916 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:28.216881 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-zrqwv"]
Apr 28 19:27:28.217347 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:28.217084 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-zrqwv" podUID="b7480381-ac39-4da5-9fd8-54d3f30e601e" containerName="authorino" containerID="cri-o://a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64" gracePeriod=30
Apr 28 19:27:28.455237 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:28.455212 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-zrqwv"
Apr 28 19:27:28.581862 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:28.581835 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvxlv\" (UniqueName: \"kubernetes.io/projected/b7480381-ac39-4da5-9fd8-54d3f30e601e-kube-api-access-dvxlv\") pod \"b7480381-ac39-4da5-9fd8-54d3f30e601e\" (UID: \"b7480381-ac39-4da5-9fd8-54d3f30e601e\") "
Apr 28 19:27:28.583751 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:28.583726 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7480381-ac39-4da5-9fd8-54d3f30e601e-kube-api-access-dvxlv" (OuterVolumeSpecName: "kube-api-access-dvxlv") pod "b7480381-ac39-4da5-9fd8-54d3f30e601e" (UID: "b7480381-ac39-4da5-9fd8-54d3f30e601e"). InnerVolumeSpecName "kube-api-access-dvxlv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:27:28.683001 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:28.682965 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvxlv\" (UniqueName: \"kubernetes.io/projected/b7480381-ac39-4da5-9fd8-54d3f30e601e-kube-api-access-dvxlv\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:27:29.035501 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.035463 2565 generic.go:358] "Generic (PLEG): container finished" podID="b7480381-ac39-4da5-9fd8-54d3f30e601e" containerID="a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64" exitCode=0
Apr 28 19:27:29.035649 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.035527 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-zrqwv"
Apr 28 19:27:29.035649 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.035550 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-zrqwv" event={"ID":"b7480381-ac39-4da5-9fd8-54d3f30e601e","Type":"ContainerDied","Data":"a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64"}
Apr 28 19:27:29.035649 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.035589 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-zrqwv" event={"ID":"b7480381-ac39-4da5-9fd8-54d3f30e601e","Type":"ContainerDied","Data":"b66ef1192be82ee50de110ef8ac575e3a8d407bd801b490e9b1ce9f9b1d5919e"}
Apr 28 19:27:29.035649 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.035604 2565 scope.go:117] "RemoveContainer" containerID="a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64"
Apr 28 19:27:29.043969 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.043952 2565 scope.go:117] "RemoveContainer" containerID="a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64"
Apr 28 19:27:29.044226 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:27:29.044207 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64\": container with ID starting with a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64 not found: ID does not exist" containerID="a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64"
Apr 28 19:27:29.044308 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.044233 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64"} err="failed to get container status \"a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64\": rpc error: code = NotFound desc = could not find container \"a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64\": container with ID starting with a81a4e3ea2801cd924e11e582f0fc1a719aa4dd7f3a8b2be139f905145db4a64 not found: ID does not exist"
Apr 28 19:27:29.059805 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.059782 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-zrqwv"]
Apr 28 19:27:29.070465 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:29.070442 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-zrqwv"]
Apr 28 19:27:30.111234 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:30.111193 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7480381-ac39-4da5-9fd8-54d3f30e601e" path="/var/lib/kubelet/pods/b7480381-ac39-4da5-9fd8-54d3f30e601e/volumes"
Apr 28 19:27:46.304467 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.304421 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b85c69797-2xbq4"]
Apr 28 19:27:46.304902 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.304766 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7480381-ac39-4da5-9fd8-54d3f30e601e" containerName="authorino"
Apr 28 19:27:46.304902 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.304777 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7480381-ac39-4da5-9fd8-54d3f30e601e" containerName="authorino"
Apr 28 19:27:46.304902 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.304786 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff03eef9-9efd-42c9-8deb-112a6145a129" containerName="limitador"
Apr 28 19:27:46.304902 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.304791 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff03eef9-9efd-42c9-8deb-112a6145a129" containerName="limitador"
Apr 28 19:27:46.304902 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.304853 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7480381-ac39-4da5-9fd8-54d3f30e601e" containerName="authorino"
Apr 28 19:27:46.304902 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.304864 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff03eef9-9efd-42c9-8deb-112a6145a129" containerName="limitador"
Apr 28 19:27:46.306664 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.306649 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:46.309232 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.309202 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 28 19:27:46.309379 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.309267 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 28 19:27:46.309448 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.309430 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 28 19:27:46.310617 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.310600 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-rzrtl\""
Apr 28 19:27:46.317347 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.317323 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-2xbq4"]
Apr 28 19:27:46.325587 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.325562 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"]
Apr 28 19:27:46.327863 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.327841 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:46.330671 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.330652 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 28 19:27:46.330789 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.330773 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-hbt5m\""
Apr 28 19:27:46.338400 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.338380 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmdx\" (UniqueName: \"kubernetes.io/projected/26e4f025-832e-4ff0-836e-75fd5c697734-kube-api-access-wwmdx\") pod \"kserve-controller-manager-b85c69797-2xbq4\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:46.338496 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.338407 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert\") pod \"llmisvc-controller-manager-65594cb6f6-dfc8f\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:46.338496 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.338433 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert\") pod \"kserve-controller-manager-b85c69797-2xbq4\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:46.338619 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.338597 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpvpb\" (UniqueName: \"kubernetes.io/projected/5b00c990-4a99-4916-8902-69f0f8865190-kube-api-access-tpvpb\") pod \"llmisvc-controller-manager-65594cb6f6-dfc8f\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:46.342143 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.342120 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"]
Apr 28 19:27:46.439543 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.439511 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpvpb\" (UniqueName: \"kubernetes.io/projected/5b00c990-4a99-4916-8902-69f0f8865190-kube-api-access-tpvpb\") pod \"llmisvc-controller-manager-65594cb6f6-dfc8f\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:46.439719 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.439652 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwmdx\" (UniqueName: \"kubernetes.io/projected/26e4f025-832e-4ff0-836e-75fd5c697734-kube-api-access-wwmdx\") pod \"kserve-controller-manager-b85c69797-2xbq4\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:46.439719 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.439673 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert\") pod \"llmisvc-controller-manager-65594cb6f6-dfc8f\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:46.439719 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.439690 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert\") pod \"kserve-controller-manager-b85c69797-2xbq4\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:46.439882 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:27:46.439813 2565 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 28 19:27:46.439882 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:27:46.439847 2565 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 28 19:27:46.439980 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:27:46.439901 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert podName:5b00c990-4a99-4916-8902-69f0f8865190 nodeName:}" failed. No retries permitted until 2026-04-28 19:27:46.939878128 +0000 UTC m=+651.445594783 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert") pod "llmisvc-controller-manager-65594cb6f6-dfc8f" (UID: "5b00c990-4a99-4916-8902-69f0f8865190") : secret "llmisvc-webhook-server-cert" not found
Apr 28 19:27:46.439980 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:27:46.439922 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert podName:26e4f025-832e-4ff0-836e-75fd5c697734 nodeName:}" failed. No retries permitted until 2026-04-28 19:27:46.939913479 +0000 UTC m=+651.445630124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert") pod "kserve-controller-manager-b85c69797-2xbq4" (UID: "26e4f025-832e-4ff0-836e-75fd5c697734") : secret "kserve-webhook-server-cert" not found
Apr 28 19:27:46.448751 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.448723 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwmdx\" (UniqueName: \"kubernetes.io/projected/26e4f025-832e-4ff0-836e-75fd5c697734-kube-api-access-wwmdx\") pod \"kserve-controller-manager-b85c69797-2xbq4\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:46.451800 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.451773 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpvpb\" (UniqueName: \"kubernetes.io/projected/5b00c990-4a99-4916-8902-69f0f8865190-kube-api-access-tpvpb\") pod \"llmisvc-controller-manager-65594cb6f6-dfc8f\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:46.944455 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.944424 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert\") pod \"llmisvc-controller-manager-65594cb6f6-dfc8f\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:46.944455 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.944469 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert\") pod \"kserve-controller-manager-b85c69797-2xbq4\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:46.944693 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:27:46.944567 2565 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 28 19:27:46.944693 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:27:46.944633 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert podName:5b00c990-4a99-4916-8902-69f0f8865190 nodeName:}" failed. No retries permitted until 2026-04-28 19:27:47.944617119 +0000 UTC m=+652.450333763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert") pod "llmisvc-controller-manager-65594cb6f6-dfc8f" (UID: "5b00c990-4a99-4916-8902-69f0f8865190") : secret "llmisvc-webhook-server-cert" not found
Apr 28 19:27:46.946891 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:46.946874 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert\") pod \"kserve-controller-manager-b85c69797-2xbq4\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:47.217913 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:47.217810 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-2xbq4"
Apr 28 19:27:47.342867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:47.342835 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-2xbq4"]
Apr 28 19:27:47.345658 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:27:47.345632 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e4f025_832e_4ff0_836e_75fd5c697734.slice/crio-1c92538399f6a2fed3c42e6dfb86532a34382d49d524dbe6b0b653f4a1a7c1e4 WatchSource:0}: Error finding container 1c92538399f6a2fed3c42e6dfb86532a34382d49d524dbe6b0b653f4a1a7c1e4: Status 404 returned error can't find the container with id 1c92538399f6a2fed3c42e6dfb86532a34382d49d524dbe6b0b653f4a1a7c1e4
Apr 28 19:27:47.346908 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:47.346884 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:27:47.953726 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:47.953690 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert\") pod \"llmisvc-controller-manager-65594cb6f6-dfc8f\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:47.956739 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:47.956711 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert\") pod \"llmisvc-controller-manager-65594cb6f6-dfc8f\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:48.104521 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:48.104475 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" event={"ID":"26e4f025-832e-4ff0-836e-75fd5c697734","Type":"ContainerStarted","Data":"1c92538399f6a2fed3c42e6dfb86532a34382d49d524dbe6b0b653f4a1a7c1e4"}
Apr 28 19:27:48.138867 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:48.138830 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"
Apr 28 19:27:48.294048 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:48.293888 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"]
Apr 28 19:27:48.297180 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:27:48.297133 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5b00c990_4a99_4916_8902_69f0f8865190.slice/crio-0dfd673ac36423bede85b21d552ef3c97ce57b8e5d697afc7bd163d976e421e8 WatchSource:0}: Error finding container 0dfd673ac36423bede85b21d552ef3c97ce57b8e5d697afc7bd163d976e421e8: Status 404 returned error can't find the container with id 0dfd673ac36423bede85b21d552ef3c97ce57b8e5d697afc7bd163d976e421e8
Apr 28 19:27:49.110609 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:49.110578 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" event={"ID":"5b00c990-4a99-4916-8902-69f0f8865190","Type":"ContainerStarted","Data":"0dfd673ac36423bede85b21d552ef3c97ce57b8e5d697afc7bd163d976e421e8"}
Apr 28 19:27:50.114930 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:50.114891 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" event={"ID":"26e4f025-832e-4ff0-836e-75fd5c697734","Type":"ContainerStarted","Data":"cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11"}
Apr 28 19:27:50.115316 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:50.115027 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="kserve/kserve-controller-manager-b85c69797-2xbq4" Apr 28 19:27:50.133272 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:50.133226 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" podStartSLOduration=1.798784522 podStartE2EDuration="4.133210804s" podCreationTimestamp="2026-04-28 19:27:46 +0000 UTC" firstStartedPulling="2026-04-28 19:27:47.347004663 +0000 UTC m=+651.852721307" lastFinishedPulling="2026-04-28 19:27:49.681430945 +0000 UTC m=+654.187147589" observedRunningTime="2026-04-28 19:27:50.131729089 +0000 UTC m=+654.637445755" watchObservedRunningTime="2026-04-28 19:27:50.133210804 +0000 UTC m=+654.638927470" Apr 28 19:27:53.125980 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:53.125941 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" event={"ID":"5b00c990-4a99-4916-8902-69f0f8865190","Type":"ContainerStarted","Data":"20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60"} Apr 28 19:27:53.126577 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:53.126001 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" Apr 28 19:27:53.144108 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:27:53.144056 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" podStartSLOduration=3.077693443 podStartE2EDuration="7.144042597s" podCreationTimestamp="2026-04-28 19:27:46 +0000 UTC" firstStartedPulling="2026-04-28 19:27:48.299289818 +0000 UTC m=+652.805006476" lastFinishedPulling="2026-04-28 19:27:52.365638965 +0000 UTC m=+656.871355630" observedRunningTime="2026-04-28 19:27:53.14204893 +0000 UTC m=+657.647765597" watchObservedRunningTime="2026-04-28 19:27:53.144042597 +0000 UTC m=+657.649759263" Apr 28 19:28:21.122873 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:21.122837 
2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" Apr 28 19:28:24.131087 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:24.131058 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" Apr 28 19:28:25.565487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.565452 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-2xbq4"] Apr 28 19:28:25.565880 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.565692 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" podUID="26e4f025-832e-4ff0-836e-75fd5c697734" containerName="manager" containerID="cri-o://cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11" gracePeriod=10 Apr 28 19:28:25.595040 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.595015 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b85c69797-wpbd6"] Apr 28 19:28:25.597482 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.597463 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:25.608765 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.608744 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-wpbd6"] Apr 28 19:28:25.788126 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.788097 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36820a29-0121-440b-9c92-de28da74677f-cert\") pod \"kserve-controller-manager-b85c69797-wpbd6\" (UID: \"36820a29-0121-440b-9c92-de28da74677f\") " pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:25.788309 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.788149 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnh7l\" (UniqueName: \"kubernetes.io/projected/36820a29-0121-440b-9c92-de28da74677f-kube-api-access-tnh7l\") pod \"kserve-controller-manager-b85c69797-wpbd6\" (UID: \"36820a29-0121-440b-9c92-de28da74677f\") " pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:25.804720 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.804700 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" Apr 28 19:28:25.888676 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.888598 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36820a29-0121-440b-9c92-de28da74677f-cert\") pod \"kserve-controller-manager-b85c69797-wpbd6\" (UID: \"36820a29-0121-440b-9c92-de28da74677f\") " pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:25.888676 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.888641 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnh7l\" (UniqueName: \"kubernetes.io/projected/36820a29-0121-440b-9c92-de28da74677f-kube-api-access-tnh7l\") pod \"kserve-controller-manager-b85c69797-wpbd6\" (UID: \"36820a29-0121-440b-9c92-de28da74677f\") " pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:25.890906 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.890888 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36820a29-0121-440b-9c92-de28da74677f-cert\") pod \"kserve-controller-manager-b85c69797-wpbd6\" (UID: \"36820a29-0121-440b-9c92-de28da74677f\") " pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:25.898727 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.898705 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnh7l\" (UniqueName: \"kubernetes.io/projected/36820a29-0121-440b-9c92-de28da74677f-kube-api-access-tnh7l\") pod \"kserve-controller-manager-b85c69797-wpbd6\" (UID: \"36820a29-0121-440b-9c92-de28da74677f\") " pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:25.947877 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.947825 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:25.990018 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.989979 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwmdx\" (UniqueName: \"kubernetes.io/projected/26e4f025-832e-4ff0-836e-75fd5c697734-kube-api-access-wwmdx\") pod \"26e4f025-832e-4ff0-836e-75fd5c697734\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " Apr 28 19:28:25.990018 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.990014 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert\") pod \"26e4f025-832e-4ff0-836e-75fd5c697734\" (UID: \"26e4f025-832e-4ff0-836e-75fd5c697734\") " Apr 28 19:28:25.994270 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.993945 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert" (OuterVolumeSpecName: "cert") pod "26e4f025-832e-4ff0-836e-75fd5c697734" (UID: "26e4f025-832e-4ff0-836e-75fd5c697734"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:28:25.994270 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:25.994142 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e4f025-832e-4ff0-836e-75fd5c697734-kube-api-access-wwmdx" (OuterVolumeSpecName: "kube-api-access-wwmdx") pod "26e4f025-832e-4ff0-836e-75fd5c697734" (UID: "26e4f025-832e-4ff0-836e-75fd5c697734"). InnerVolumeSpecName "kube-api-access-wwmdx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:28:26.069213 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.069183 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-wpbd6"] Apr 28 19:28:26.071852 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:28:26.071826 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36820a29_0121_440b_9c92_de28da74677f.slice/crio-1045fe658ced2254e9ee85a31a932d25a172903a6649cd79be583b9cadd6cbab WatchSource:0}: Error finding container 1045fe658ced2254e9ee85a31a932d25a172903a6649cd79be583b9cadd6cbab: Status 404 returned error can't find the container with id 1045fe658ced2254e9ee85a31a932d25a172903a6649cd79be583b9cadd6cbab Apr 28 19:28:26.090916 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.090891 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwmdx\" (UniqueName: \"kubernetes.io/projected/26e4f025-832e-4ff0-836e-75fd5c697734-kube-api-access-wwmdx\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:28:26.090916 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.090911 2565 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26e4f025-832e-4ff0-836e-75fd5c697734-cert\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:28:26.230927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.230836 2565 generic.go:358] "Generic (PLEG): container finished" podID="26e4f025-832e-4ff0-836e-75fd5c697734" containerID="cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11" exitCode=0 Apr 28 19:28:26.230927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.230905 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" Apr 28 19:28:26.230927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.230916 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" event={"ID":"26e4f025-832e-4ff0-836e-75fd5c697734","Type":"ContainerDied","Data":"cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11"} Apr 28 19:28:26.231238 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.230949 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-2xbq4" event={"ID":"26e4f025-832e-4ff0-836e-75fd5c697734","Type":"ContainerDied","Data":"1c92538399f6a2fed3c42e6dfb86532a34382d49d524dbe6b0b653f4a1a7c1e4"} Apr 28 19:28:26.231238 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.230968 2565 scope.go:117] "RemoveContainer" containerID="cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11" Apr 28 19:28:26.238263 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.238224 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-wpbd6" event={"ID":"36820a29-0121-440b-9c92-de28da74677f","Type":"ContainerStarted","Data":"1045fe658ced2254e9ee85a31a932d25a172903a6649cd79be583b9cadd6cbab"} Apr 28 19:28:26.244890 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.244867 2565 scope.go:117] "RemoveContainer" containerID="cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11" Apr 28 19:28:26.245142 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:28:26.245123 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11\": container with ID starting with cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11 not found: ID does not exist" containerID="cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11" Apr 28 
19:28:26.245277 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.245151 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11"} err="failed to get container status \"cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11\": rpc error: code = NotFound desc = could not find container \"cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11\": container with ID starting with cdbb34e98c39614f23cce8656c7f067d475ff7eed3ea49beb4af2b41a8b98a11 not found: ID does not exist" Apr 28 19:28:26.248338 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.248315 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-2xbq4"] Apr 28 19:28:26.251663 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:26.251644 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-2xbq4"] Apr 28 19:28:27.243389 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:27.243347 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-wpbd6" event={"ID":"36820a29-0121-440b-9c92-de28da74677f","Type":"ContainerStarted","Data":"f082fe1c33a8b03d688e5239a3584669319ce12ffae014b6819efe04cb81ef09"} Apr 28 19:28:27.243758 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:27.243500 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:28:27.261685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:27.261638 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b85c69797-wpbd6" podStartSLOduration=1.862859802 podStartE2EDuration="2.26162421s" podCreationTimestamp="2026-04-28 19:28:25 +0000 UTC" firstStartedPulling="2026-04-28 19:28:26.073112947 +0000 UTC m=+690.578829605" lastFinishedPulling="2026-04-28 
19:28:26.471877364 +0000 UTC m=+690.977594013" observedRunningTime="2026-04-28 19:28:27.260099099 +0000 UTC m=+691.765815775" watchObservedRunningTime="2026-04-28 19:28:27.26162421 +0000 UTC m=+691.767340877" Apr 28 19:28:28.110972 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:28.110939 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e4f025-832e-4ff0-836e-75fd5c697734" path="/var/lib/kubelet/pods/26e4f025-832e-4ff0-836e-75fd5c697734/volumes" Apr 28 19:28:58.251856 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:28:58.251818 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b85c69797-wpbd6" Apr 28 19:29:35.497179 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.497121 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9"] Apr 28 19:29:35.497658 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.497647 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26e4f025-832e-4ff0-836e-75fd5c697734" containerName="manager" Apr 28 19:29:35.497716 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.497665 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e4f025-832e-4ff0-836e-75fd5c697734" containerName="manager" Apr 28 19:29:35.497773 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.497755 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="26e4f025-832e-4ff0-836e-75fd5c697734" containerName="manager" Apr 28 19:29:35.500266 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.500240 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.503193 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.503135 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:29:35.503455 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.503149 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-rzjtc\"" Apr 28 19:29:35.503717 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.503699 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:29:35.504216 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.504194 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 28 19:29:35.519617 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.519592 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9"] Apr 28 19:29:35.599308 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599278 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.599491 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599315 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-workload-socket\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.599491 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599358 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.599491 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599392 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.599491 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599408 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.599491 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599463 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: 
\"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.599728 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599506 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcg7v\" (UniqueName: \"kubernetes.io/projected/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-kube-api-access-lcg7v\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.599728 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599541 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.599728 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.599625 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.700732 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.700689 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.700922 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.700749 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.700922 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.700775 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.700922 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.700799 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.700922 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.700821 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcg7v\" (UniqueName: \"kubernetes.io/projected/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-kube-api-access-lcg7v\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.700922 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:29:35.700853 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.700922 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.700895 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.701257 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.700985 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.701257 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.701015 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.701467 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.701439 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.701467 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.701459 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.701619 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.701521 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.701656 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.701604 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.702057 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.702033 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: 
\"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.703114 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.703092 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.703437 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.703420 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.710030 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.710010 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.710288 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.710264 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcg7v\" (UniqueName: \"kubernetes.io/projected/cf96fe9c-423f-4e83-9e21-3a1128dc1f55-kube-api-access-lcg7v\") pod \"router-gateway-1-openshift-default-6c59fbf55c-s56s9\" (UID: \"cf96fe9c-423f-4e83-9e21-3a1128dc1f55\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.815042 ip-10-0-141-41 kubenswrapper[2565]: 
I0428 19:29:35.815011 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:35.944280 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:35.944240 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9"] Apr 28 19:29:35.951258 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:29:35.951229 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf96fe9c_423f_4e83_9e21_3a1128dc1f55.slice/crio-3e531d7ee761fd8b16697a944b352cbd6bd9f7f4c361612b39c97d31601c454c WatchSource:0}: Error finding container 3e531d7ee761fd8b16697a944b352cbd6bd9f7f4c361612b39c97d31601c454c: Status 404 returned error can't find the container with id 3e531d7ee761fd8b16697a944b352cbd6bd9f7f4c361612b39c97d31601c454c Apr 28 19:29:36.473674 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:36.473633 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" event={"ID":"cf96fe9c-423f-4e83-9e21-3a1128dc1f55","Type":"ContainerStarted","Data":"3e531d7ee761fd8b16697a944b352cbd6bd9f7f4c361612b39c97d31601c454c"} Apr 28 19:29:39.050596 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:39.050552 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 28 19:29:39.050957 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:39.050643 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 28 19:29:39.050957 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:39.050686 2565 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 28 19:29:39.485547 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:39.485510 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" event={"ID":"cf96fe9c-423f-4e83-9e21-3a1128dc1f55","Type":"ContainerStarted","Data":"254e80f434c15f576859412c656663db3b1fd104033e58d171ab1fa399512d75"} Apr 28 19:29:39.507934 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:39.507888 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" podStartSLOduration=1.41075166 podStartE2EDuration="4.507873687s" podCreationTimestamp="2026-04-28 19:29:35 +0000 UTC" firstStartedPulling="2026-04-28 19:29:35.953213171 +0000 UTC m=+760.458929828" lastFinishedPulling="2026-04-28 19:29:39.050335212 +0000 UTC m=+763.556051855" observedRunningTime="2026-04-28 19:29:39.505489059 +0000 UTC m=+764.011205725" watchObservedRunningTime="2026-04-28 19:29:39.507873687 +0000 UTC m=+764.013590350" Apr 28 19:29:39.815769 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:39.815734 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:40.820455 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:40.820424 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:41.491287 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:41.491255 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:29:41.492241 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:29:41.492223 2565 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-s56s9" Apr 28 19:30:07.083976 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.083942 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86"] Apr 28 19:30:07.094497 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.094462 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.098219 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.098192 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-vz4n6\"" Apr 28 19:30:07.099067 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.099046 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec0c69dceeb48768325d1a53a749e65786-kserve-self-signed-certs\"" Apr 28 19:30:07.108677 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.108654 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86"] Apr 28 19:30:07.178332 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.178303 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.178492 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.178347 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.178492 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.178370 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.178492 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.178451 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.178661 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.178522 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.178661 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.178557 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.178661 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.178583 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ghf6\" (UniqueName: \"kubernetes.io/projected/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kube-api-access-2ghf6\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.279924 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.279891 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280107 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.279948 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280107 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.279978 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280107 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.280014 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280107 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.280069 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280107 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.280096 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280390 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.280122 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ghf6\" (UniqueName: 
\"kubernetes.io/projected/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kube-api-access-2ghf6\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280390 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.280358 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280498 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.280464 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280558 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.280522 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.280600 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.280561 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kserve-provision-location\") pod 
\"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.282097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.282077 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.282467 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.282441 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.288964 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.288942 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ghf6\" (UniqueName: \"kubernetes.io/projected/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kube-api-access-2ghf6\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.406747 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.406674 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:07.528156 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.528126 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86"] Apr 28 19:30:07.531789 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:30:07.531760 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0a1117d_603a_4a6a_9ebd_58dee8ffe9d4.slice/crio-fdde3a1d1ce1d729e58187fd622d40d1ea04eeb8907f98cd740f8dd7b0119145 WatchSource:0}: Error finding container fdde3a1d1ce1d729e58187fd622d40d1ea04eeb8907f98cd740f8dd7b0119145: Status 404 returned error can't find the container with id fdde3a1d1ce1d729e58187fd622d40d1ea04eeb8907f98cd740f8dd7b0119145 Apr 28 19:30:07.581385 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:07.581349 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" event={"ID":"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4","Type":"ContainerStarted","Data":"fdde3a1d1ce1d729e58187fd622d40d1ea04eeb8907f98cd740f8dd7b0119145"} Apr 28 19:30:11.597505 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:11.597463 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" event={"ID":"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4","Type":"ContainerStarted","Data":"e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72"} Apr 28 19:30:19.050589 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:19.050557 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86"] Apr 28 19:30:19.050953 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:19.050795 2565 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" podUID="e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" containerName="storage-initializer" containerID="cri-o://e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72" gracePeriod=30 Apr 28 19:30:29.305594 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.305568 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:29.399739 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.399706 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tls-certs\") pod \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " Apr 28 19:30:29.399925 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.399767 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kserve-provision-location\") pod \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " Apr 28 19:30:29.399925 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.399830 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-model-cache\") pod \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " Apr 28 19:30:29.399925 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.399852 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-dshm\") pod \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\" (UID: 
\"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " Apr 28 19:30:29.399925 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.399898 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tmp-dir\") pod \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " Apr 28 19:30:29.399925 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.399919 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-home\") pod \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " Apr 28 19:30:29.400199 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.399953 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ghf6\" (UniqueName: \"kubernetes.io/projected/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kube-api-access-2ghf6\") pod \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\" (UID: \"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4\") " Apr 28 19:30:29.400199 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.400069 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-model-cache" (OuterVolumeSpecName: "model-cache") pod "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" (UID: "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:30:29.400326 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.400201 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-home" (OuterVolumeSpecName: "home") pod "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" (UID: "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:30:29.400326 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.400211 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" (UID: "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:30:29.400417 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.400341 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:30:29.400417 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.400356 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:30:29.400417 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.400366 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:30:29.401992 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.401968 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" (UID: "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:30:29.402336 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.402319 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-dshm" (OuterVolumeSpecName: "dshm") pod "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" (UID: "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:30:29.402405 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.402346 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kube-api-access-2ghf6" (OuterVolumeSpecName: "kube-api-access-2ghf6") pod "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" (UID: "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4"). InnerVolumeSpecName "kube-api-access-2ghf6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:30:29.464274 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.464180 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" (UID: "e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:30:29.501080 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.501044 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ghf6\" (UniqueName: \"kubernetes.io/projected/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kube-api-access-2ghf6\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:30:29.501080 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.501077 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:30:29.501286 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.501089 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:30:29.501286 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.501098 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:30:29.655827 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.655789 2565 generic.go:358] "Generic (PLEG): container finished" podID="e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" containerID="e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72" exitCode=0 Apr 28 19:30:29.655991 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.655851 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" Apr 28 19:30:29.655991 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.655875 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" event={"ID":"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4","Type":"ContainerDied","Data":"e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72"} Apr 28 19:30:29.655991 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.655914 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86" event={"ID":"e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4","Type":"ContainerDied","Data":"fdde3a1d1ce1d729e58187fd622d40d1ea04eeb8907f98cd740f8dd7b0119145"} Apr 28 19:30:29.655991 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.655930 2565 scope.go:117] "RemoveContainer" containerID="e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72" Apr 28 19:30:29.690298 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.690265 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86"] Apr 28 19:30:29.693622 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.693597 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-695f6f86879wz86"] Apr 28 19:30:29.737674 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.737655 2565 scope.go:117] "RemoveContainer" containerID="e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72" Apr 28 19:30:29.738015 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:30:29.737989 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72\": container with 
ID starting with e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72 not found: ID does not exist" containerID="e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72"
Apr 28 19:30:29.738124 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:29.738021 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72"} err="failed to get container status \"e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72\": rpc error: code = NotFound desc = could not find container \"e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72\": container with ID starting with e14448890ac2f5ed0bcc7d70999f552b4e37ba9bc1b9b546bc36ffc9c1aa6a72 not found: ID does not exist"
Apr 28 19:30:30.112332 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:30.112300 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" path="/var/lib/kubelet/pods/e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4/volumes"
Apr 28 19:30:40.960254 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:40.960222 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"]
Apr 28 19:30:40.960881 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:40.960714 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" containerName="storage-initializer"
Apr 28 19:30:40.960881 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:40.960731 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" containerName="storage-initializer"
Apr 28 19:30:40.960881 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:40.960815 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0a1117d-603a-4a6a-9ebd-58dee8ffe9d4" containerName="storage-initializer"
Apr 28 19:30:40.965831 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:40.965809 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:40.970176 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:40.970141 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-vz4n6\""
Apr 28 19:30:40.970306 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:40.970146 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec2774c263d49959f50d9eebc552e13bf9-kserve-self-signed-certs\""
Apr 28 19:30:40.978273 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:40.978250 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"]
Apr 28 19:30:41.012600 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.012576 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.012730 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.012613 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6nw6\" (UniqueName: \"kubernetes.io/projected/35c57730-1188-486b-84fa-92cb82e52c73-kube-api-access-h6nw6\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.012730 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.012643 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.012730 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.012706 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.012890 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.012782 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.012890 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.012841 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.012890 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.012881 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35c57730-1188-486b-84fa-92cb82e52c73-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114046 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114014 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114046 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114049 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6nw6\" (UniqueName: \"kubernetes.io/projected/35c57730-1188-486b-84fa-92cb82e52c73-kube-api-access-h6nw6\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114256 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114071 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114256 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114120 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114256 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114196 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114256 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114228 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114444 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114256 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35c57730-1188-486b-84fa-92cb82e52c73-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114571 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114543 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114571 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114559 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114766 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114629 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.114766 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.114671 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.116593 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.116569 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.116788 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.116769 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35c57730-1188-486b-84fa-92cb82e52c73-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.122752 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.122722 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6nw6\" (UniqueName: \"kubernetes.io/projected/35c57730-1188-486b-84fa-92cb82e52c73-kube-api-access-h6nw6\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.277583 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.277547 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:41.608931 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.608891 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"]
Apr 28 19:30:41.610965 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:30:41.610935 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c57730_1188_486b_84fa_92cb82e52c73.slice/crio-dd09e633f48a1360637dd8d604a722a1d8539ab89c550f636fbee09f6db7e70d WatchSource:0}: Error finding container dd09e633f48a1360637dd8d604a722a1d8539ab89c550f636fbee09f6db7e70d: Status 404 returned error can't find the container with id dd09e633f48a1360637dd8d604a722a1d8539ab89c550f636fbee09f6db7e70d
Apr 28 19:30:41.697186 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.697126 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr" event={"ID":"35c57730-1188-486b-84fa-92cb82e52c73","Type":"ContainerStarted","Data":"1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac"}
Apr 28 19:30:41.697324 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:41.697192 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr" event={"ID":"35c57730-1188-486b-84fa-92cb82e52c73","Type":"ContainerStarted","Data":"dd09e633f48a1360637dd8d604a722a1d8539ab89c550f636fbee09f6db7e70d"}
Apr 28 19:30:45.713040 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:45.713007 2565 generic.go:358] "Generic (PLEG): container finished" podID="35c57730-1188-486b-84fa-92cb82e52c73" containerID="1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac" exitCode=0
Apr 28 19:30:45.713354 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:45.713082 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr" event={"ID":"35c57730-1188-486b-84fa-92cb82e52c73","Type":"ContainerDied","Data":"1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac"}
Apr 28 19:30:47.722449 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:47.722415 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr" event={"ID":"35c57730-1188-486b-84fa-92cb82e52c73","Type":"ContainerStarted","Data":"f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729"}
Apr 28 19:30:47.744574 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:47.744526 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr" podStartSLOduration=6.472447026 podStartE2EDuration="7.744509159s" podCreationTimestamp="2026-04-28 19:30:40 +0000 UTC" firstStartedPulling="2026-04-28 19:30:45.714142212 +0000 UTC m=+830.219858857" lastFinishedPulling="2026-04-28 19:30:46.986204345 +0000 UTC m=+831.491920990" observedRunningTime="2026-04-28 19:30:47.741915242 +0000 UTC m=+832.247631908" watchObservedRunningTime="2026-04-28 19:30:47.744509159 +0000 UTC m=+832.250225848"
Apr 28 19:30:51.278579 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:51.278541 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:51.278579 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:51.278587 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:51.290921 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:51.290893 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:51.749035 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:51.748954 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:59.062379 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.062344 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"]
Apr 28 19:30:59.062870 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.062661 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr" podUID="35c57730-1188-486b-84fa-92cb82e52c73" containerName="main" containerID="cri-o://f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729" gracePeriod=30
Apr 28 19:30:59.359581 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.359557 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:59.471636 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471602 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-home\") pod \"35c57730-1188-486b-84fa-92cb82e52c73\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") "
Apr 28 19:30:59.471813 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471665 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35c57730-1188-486b-84fa-92cb82e52c73-tls-certs\") pod \"35c57730-1188-486b-84fa-92cb82e52c73\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") "
Apr 28 19:30:59.471813 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471716 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-tmp-dir\") pod \"35c57730-1188-486b-84fa-92cb82e52c73\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") "
Apr 28 19:30:59.471813 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471780 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-model-cache\") pod \"35c57730-1188-486b-84fa-92cb82e52c73\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") "
Apr 28 19:30:59.471813 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471804 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-kserve-provision-location\") pod \"35c57730-1188-486b-84fa-92cb82e52c73\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") "
Apr 28 19:30:59.472028 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471842 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-dshm\") pod \"35c57730-1188-486b-84fa-92cb82e52c73\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") "
Apr 28 19:30:59.472028 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471872 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-home" (OuterVolumeSpecName: "home") pod "35c57730-1188-486b-84fa-92cb82e52c73" (UID: "35c57730-1188-486b-84fa-92cb82e52c73"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:30:59.472028 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471883 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6nw6\" (UniqueName: \"kubernetes.io/projected/35c57730-1188-486b-84fa-92cb82e52c73-kube-api-access-h6nw6\") pod \"35c57730-1188-486b-84fa-92cb82e52c73\" (UID: \"35c57730-1188-486b-84fa-92cb82e52c73\") "
Apr 28 19:30:59.472028 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.471980 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "35c57730-1188-486b-84fa-92cb82e52c73" (UID: "35c57730-1188-486b-84fa-92cb82e52c73"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:30:59.472258 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.472083 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-model-cache" (OuterVolumeSpecName: "model-cache") pod "35c57730-1188-486b-84fa-92cb82e52c73" (UID: "35c57730-1188-486b-84fa-92cb82e52c73"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:30:59.472297 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.472262 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:30:59.472297 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.472283 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:30:59.472376 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.472299 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:30:59.474007 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.473980 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-dshm" (OuterVolumeSpecName: "dshm") pod "35c57730-1188-486b-84fa-92cb82e52c73" (UID: "35c57730-1188-486b-84fa-92cb82e52c73"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:30:59.474259 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.474241 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c57730-1188-486b-84fa-92cb82e52c73-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "35c57730-1188-486b-84fa-92cb82e52c73" (UID: "35c57730-1188-486b-84fa-92cb82e52c73"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:30:59.474338 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.474261 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c57730-1188-486b-84fa-92cb82e52c73-kube-api-access-h6nw6" (OuterVolumeSpecName: "kube-api-access-h6nw6") pod "35c57730-1188-486b-84fa-92cb82e52c73" (UID: "35c57730-1188-486b-84fa-92cb82e52c73"). InnerVolumeSpecName "kube-api-access-h6nw6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:30:59.573821 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.573783 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6nw6\" (UniqueName: \"kubernetes.io/projected/35c57730-1188-486b-84fa-92cb82e52c73-kube-api-access-h6nw6\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:30:59.573821 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.573819 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35c57730-1188-486b-84fa-92cb82e52c73-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:30:59.574006 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.573835 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:30:59.764142 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.764104 2565 generic.go:358] "Generic (PLEG): container finished" podID="35c57730-1188-486b-84fa-92cb82e52c73" containerID="f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729" exitCode=0
Apr 28 19:30:59.764320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.764183 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"
Apr 28 19:30:59.764320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.764191 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr" event={"ID":"35c57730-1188-486b-84fa-92cb82e52c73","Type":"ContainerDied","Data":"f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729"}
Apr 28 19:30:59.764320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.764231 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr" event={"ID":"35c57730-1188-486b-84fa-92cb82e52c73","Type":"ContainerDied","Data":"dd09e633f48a1360637dd8d604a722a1d8539ab89c550f636fbee09f6db7e70d"}
Apr 28 19:30:59.764320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.764255 2565 scope.go:117] "RemoveContainer" containerID="f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729"
Apr 28 19:30:59.772775 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.772756 2565 scope.go:117] "RemoveContainer" containerID="1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac"
Apr 28 19:30:59.834378 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.834354 2565 scope.go:117] "RemoveContainer" containerID="f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729"
Apr 28 19:30:59.834732 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:30:59.834713 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729\": container with ID starting with f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729 not found: ID does not exist" containerID="f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729"
Apr 28 19:30:59.834785 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.834744 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729"} err="failed to get container status \"f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729\": rpc error: code = NotFound desc = could not find container \"f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729\": container with ID starting with f116bba76cbd5c16dbc6cc5e28b3d2b1eecfdc0233fd64accf9e78fce4fa2729 not found: ID does not exist"
Apr 28 19:30:59.834785 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.834766 2565 scope.go:117] "RemoveContainer" containerID="1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac"
Apr 28 19:30:59.835067 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:30:59.835046 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac\": container with ID starting with 1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac not found: ID does not exist" containerID="1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac"
Apr 28 19:30:59.835116 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:30:59.835074 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac"} err="failed to get container status \"1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac\": rpc error: code = NotFound desc = could not find container \"1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac\": container with ID starting with 1ea64efe4064f0db31ca3b689cad298b4881c9b71e6f5385a524170b3e3b6cac not found: ID does not exist"
Apr 28 19:31:01.604522 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:31:01.604476 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "35c57730-1188-486b-84fa-92cb82e52c73" (UID: "35c57730-1188-486b-84fa-92cb82e52c73"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:31:01.696837 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:31:01.696800 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35c57730-1188-486b-84fa-92cb82e52c73-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:31:01.886927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:31:01.886898 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"]
Apr 28 19:31:01.890593 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:31:01.890567 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-74f767b46d98qcr"]
Apr 28 19:31:02.111083 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:31:02.111051 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c57730-1188-486b-84fa-92cb82e52c73" path="/var/lib/kubelet/pods/35c57730-1188-486b-84fa-92cb82e52c73/volumes"
Apr 28 19:36:34.082139 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.082098 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"]
Apr 28 19:36:34.082580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.082546 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35c57730-1188-486b-84fa-92cb82e52c73" containerName="storage-initializer"
Apr 28 19:36:34.082580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.082563 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c57730-1188-486b-84fa-92cb82e52c73" containerName="storage-initializer"
Apr 28 19:36:34.082580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.082575 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35c57730-1188-486b-84fa-92cb82e52c73" containerName="main"
Apr 28 19:36:34.082580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.082581 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c57730-1188-486b-84fa-92cb82e52c73" containerName="main"
Apr 28 19:36:34.082720 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.082632 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="35c57730-1188-486b-84fa-92cb82e52c73" containerName="main"
Apr 28 19:36:34.085733 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.085715 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.088362 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.088335 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 28 19:36:34.088487 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.088455 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-vz4n6\""
Apr 28 19:36:34.096424 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.096397 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"]
Apr 28 19:36:34.124424 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.124398 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.124567 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.124440 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmblg\" (UniqueName: \"kubernetes.io/projected/08c281f3-e613-4594-a94e-8dd63066737e-kube-api-access-wmblg\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.124567 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.124479 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.124567 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.124503 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.124567 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.124527 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.124772 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.124584 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08c281f3-e613-4594-a94e-8dd63066737e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.124772 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.124651 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226135 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226100 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226152 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmblg\" (UniqueName: \"kubernetes.io/projected/08c281f3-e613-4594-a94e-8dd63066737e-kube-api-access-wmblg\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226214 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226239 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226276 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226308 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08c281f3-e613-4594-a94e-8dd63066737e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226554 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226346 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226612 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226554 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226666 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226644 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:36:34.226719 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226674 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") "
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:36:34.226769 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.226731 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:36:34.228629 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.228603 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:36:34.228844 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.228823 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08c281f3-e613-4594-a94e-8dd63066737e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:36:34.234586 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.234561 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmblg\" (UniqueName: \"kubernetes.io/projected/08c281f3-e613-4594-a94e-8dd63066737e-kube-api-access-wmblg\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:36:34.397714 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:36:34.397631 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:36:34.528748 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.528724 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"] Apr 28 19:36:34.530835 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:36:34.530805 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08c281f3_e613_4594_a94e_8dd63066737e.slice/crio-1047f2584a3d374f2adb40a754645508a504a8df1dcda0137f299e7fe9b4ac01 WatchSource:0}: Error finding container 1047f2584a3d374f2adb40a754645508a504a8df1dcda0137f299e7fe9b4ac01: Status 404 returned error can't find the container with id 1047f2584a3d374f2adb40a754645508a504a8df1dcda0137f299e7fe9b4ac01 Apr 28 19:36:34.532653 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.532634 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:36:34.861088 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.861039 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" event={"ID":"08c281f3-e613-4594-a94e-8dd63066737e","Type":"ContainerStarted","Data":"1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e"} Apr 28 19:36:34.861088 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:36:34.861087 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" event={"ID":"08c281f3-e613-4594-a94e-8dd63066737e","Type":"ContainerStarted","Data":"1047f2584a3d374f2adb40a754645508a504a8df1dcda0137f299e7fe9b4ac01"} Apr 28 19:41:13.781633 ip-10-0-141-41 kubenswrapper[2565]: I0428 
19:41:13.781596 2565 generic.go:358] "Generic (PLEG): container finished" podID="08c281f3-e613-4594-a94e-8dd63066737e" containerID="1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e" exitCode=0 Apr 28 19:41:13.781633 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:41:13.781638 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" event={"ID":"08c281f3-e613-4594-a94e-8dd63066737e","Type":"ContainerDied","Data":"1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e"} Apr 28 19:42:02.975815 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:02.975772 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" event={"ID":"08c281f3-e613-4594-a94e-8dd63066737e","Type":"ContainerStarted","Data":"3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc"} Apr 28 19:42:03.001289 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:03.001238 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podStartSLOduration=280.369776764 podStartE2EDuration="5m29.001225051s" podCreationTimestamp="2026-04-28 19:36:34 +0000 UTC" firstStartedPulling="2026-04-28 19:41:13.782756254 +0000 UTC m=+1458.288472899" lastFinishedPulling="2026-04-28 19:42:02.414204525 +0000 UTC m=+1506.919921186" observedRunningTime="2026-04-28 19:42:02.999057721 +0000 UTC m=+1507.504774389" watchObservedRunningTime="2026-04-28 19:42:03.001225051 +0000 UTC m=+1507.506941736" Apr 28 19:42:04.398636 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:04.398596 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:42:04.398636 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:04.398638 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:42:04.400149 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:04.400090 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:42:14.398685 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:14.398600 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:42:24.398847 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:24.398794 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:42:34.398227 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:34.398176 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:42:44.398078 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:44.398033 2565 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:42:54.398886 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:42:54.398835 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:43:04.399179 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:04.399098 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:43:14.398549 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:14.398506 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:43:24.398598 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:24.398554 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 28 19:43:34.408752 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:43:34.408719 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:43:34.417087 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:34.417058 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:43:39.993714 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:39.993680 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"] Apr 28 19:43:39.994226 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:39.993979 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main" containerID="cri-o://3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc" gracePeriod=30 Apr 28 19:43:49.680598 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.680560 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"] Apr 28 19:43:49.685773 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.685752 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.688460 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.688440 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 28 19:43:49.695449 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.695425 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"] Apr 28 19:43:49.762927 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.762894 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-tmp-dir\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.763086 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.762991 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-model-cache\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.763086 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.763033 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-home\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.763086 ip-10-0-141-41 kubenswrapper[2565]: 
I0428 19:43:49.763047 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-dshm\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.763086 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.763070 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txq6k\" (UniqueName: \"kubernetes.io/projected/66c088e2-be4b-40ff-b545-7fa3cf860b94-kube-api-access-txq6k\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.763249 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.763090 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66c088e2-be4b-40ff-b545-7fa3cf860b94-tls-certs\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.763249 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.763187 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.863821 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.863783 2565 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-model-cache\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.863821 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.863826 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-home\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864039 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.863842 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-dshm\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864039 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.863971 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txq6k\" (UniqueName: \"kubernetes.io/projected/66c088e2-be4b-40ff-b545-7fa3cf860b94-kube-api-access-txq6k\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864039 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.864007 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66c088e2-be4b-40ff-b545-7fa3cf860b94-tls-certs\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: 
\"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864225 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.864050 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864225 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.864096 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-tmp-dir\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864225 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.864202 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-model-cache\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864385 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.864294 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-home\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864485 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.864458 
2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.864600 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.864526 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-tmp-dir\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.866127 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.866107 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-dshm\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.866481 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.866462 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66c088e2-be4b-40ff-b545-7fa3cf860b94-tls-certs\") pod \"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.872223 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.872193 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txq6k\" (UniqueName: \"kubernetes.io/projected/66c088e2-be4b-40ff-b545-7fa3cf860b94-kube-api-access-txq6k\") pod 
\"custom-route-timeout-test-kserve-6ff7c96cb4-j9987\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:49.997206 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:49.997112 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:43:50.120994 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:50.120968 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"] Apr 28 19:43:50.123272 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:43:50.123217 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c088e2_be4b_40ff_b545_7fa3cf860b94.slice/crio-cab06e6409cee5613bc1fa781131244bce676beaf44ef6f5e6011b9fbdb643ba WatchSource:0}: Error finding container cab06e6409cee5613bc1fa781131244bce676beaf44ef6f5e6011b9fbdb643ba: Status 404 returned error can't find the container with id cab06e6409cee5613bc1fa781131244bce676beaf44ef6f5e6011b9fbdb643ba Apr 28 19:43:50.125030 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:50.125010 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:43:50.361699 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:50.361660 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" event={"ID":"66c088e2-be4b-40ff-b545-7fa3cf860b94","Type":"ContainerStarted","Data":"7fbb40f3730c98f1112c7421d5e2ec6a71d7e9928081e5926818e1b7470b746b"} Apr 28 19:43:50.361699 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:43:50.361699 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" 
event={"ID":"66c088e2-be4b-40ff-b545-7fa3cf860b94","Type":"ContainerStarted","Data":"cab06e6409cee5613bc1fa781131244bce676beaf44ef6f5e6011b9fbdb643ba"} Apr 28 19:44:10.224651 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.224620 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp_08c281f3-e613-4594-a94e-8dd63066737e/main/0.log" Apr 28 19:44:10.224967 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.224951 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" Apr 28 19:44:10.354244 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354209 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-home\") pod \"08c281f3-e613-4594-a94e-8dd63066737e\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " Apr 28 19:44:10.354403 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354258 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-kserve-provision-location\") pod \"08c281f3-e613-4594-a94e-8dd63066737e\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " Apr 28 19:44:10.354403 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354280 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmblg\" (UniqueName: \"kubernetes.io/projected/08c281f3-e613-4594-a94e-8dd63066737e-kube-api-access-wmblg\") pod \"08c281f3-e613-4594-a94e-8dd63066737e\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") " Apr 28 19:44:10.354403 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354359 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-tmp-dir\") pod \"08c281f3-e613-4594-a94e-8dd63066737e\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") "
Apr 28 19:44:10.354539 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354410 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-model-cache\") pod \"08c281f3-e613-4594-a94e-8dd63066737e\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") "
Apr 28 19:44:10.354539 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354425 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08c281f3-e613-4594-a94e-8dd63066737e-tls-certs\") pod \"08c281f3-e613-4594-a94e-8dd63066737e\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") "
Apr 28 19:44:10.354539 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354445 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-dshm\") pod \"08c281f3-e613-4594-a94e-8dd63066737e\" (UID: \"08c281f3-e613-4594-a94e-8dd63066737e\") "
Apr 28 19:44:10.354745 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354715 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-model-cache" (OuterVolumeSpecName: "model-cache") pod "08c281f3-e613-4594-a94e-8dd63066737e" (UID: "08c281f3-e613-4594-a94e-8dd63066737e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:44:10.354946 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.354912 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-home" (OuterVolumeSpecName: "home") pod "08c281f3-e613-4594-a94e-8dd63066737e" (UID: "08c281f3-e613-4594-a94e-8dd63066737e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:44:10.356667 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.356633 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c281f3-e613-4594-a94e-8dd63066737e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "08c281f3-e613-4594-a94e-8dd63066737e" (UID: "08c281f3-e613-4594-a94e-8dd63066737e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:44:10.356750 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.356699 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-dshm" (OuterVolumeSpecName: "dshm") pod "08c281f3-e613-4594-a94e-8dd63066737e" (UID: "08c281f3-e613-4594-a94e-8dd63066737e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:44:10.356832 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.356814 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c281f3-e613-4594-a94e-8dd63066737e-kube-api-access-wmblg" (OuterVolumeSpecName: "kube-api-access-wmblg") pod "08c281f3-e613-4594-a94e-8dd63066737e" (UID: "08c281f3-e613-4594-a94e-8dd63066737e"). InnerVolumeSpecName "kube-api-access-wmblg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:44:10.367115 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.367092 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "08c281f3-e613-4594-a94e-8dd63066737e" (UID: "08c281f3-e613-4594-a94e-8dd63066737e"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:44:10.414481 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.414456 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "08c281f3-e613-4594-a94e-8dd63066737e" (UID: "08c281f3-e613-4594-a94e-8dd63066737e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:44:10.426937 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.426911 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp_08c281f3-e613-4594-a94e-8dd63066737e/main/0.log"
Apr 28 19:44:10.427269 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.427248 2565 generic.go:358] "Generic (PLEG): container finished" podID="08c281f3-e613-4594-a94e-8dd63066737e" containerID="3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc" exitCode=137
Apr 28 19:44:10.427342 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.427327 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"
Apr 28 19:44:10.427383 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.427339 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" event={"ID":"08c281f3-e613-4594-a94e-8dd63066737e","Type":"ContainerDied","Data":"3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc"}
Apr 28 19:44:10.427421 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.427381 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp" event={"ID":"08c281f3-e613-4594-a94e-8dd63066737e","Type":"ContainerDied","Data":"1047f2584a3d374f2adb40a754645508a504a8df1dcda0137f299e7fe9b4ac01"}
Apr 28 19:44:10.427421 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.427400 2565 scope.go:117] "RemoveContainer" containerID="3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc"
Apr 28 19:44:10.435614 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.435596 2565 scope.go:117] "RemoveContainer" containerID="1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e"
Apr 28 19:44:10.451447 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.451413 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"]
Apr 28 19:44:10.453543 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.453522 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-77d447484-qhccp"]
Apr 28 19:44:10.453701 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.453690 2565 scope.go:117] "RemoveContainer" containerID="3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc"
Apr 28 19:44:10.453961 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:44:10.453937 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc\": container with ID starting with 3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc not found: ID does not exist" containerID="3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc"
Apr 28 19:44:10.454049 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.453967 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc"} err="failed to get container status \"3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc\": rpc error: code = NotFound desc = could not find container \"3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc\": container with ID starting with 3a4190bc4ee9908ba0da79c1042808bbda5ba05a672953b41ac2c38005d9dacc not found: ID does not exist"
Apr 28 19:44:10.454049 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.453986 2565 scope.go:117] "RemoveContainer" containerID="1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e"
Apr 28 19:44:10.454284 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:44:10.454264 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e\": container with ID starting with 1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e not found: ID does not exist" containerID="1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e"
Apr 28 19:44:10.454350 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.454294 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e"} err="failed to get container status \"1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e\": rpc error: code = NotFound desc = could not find container \"1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e\": container with ID starting with 1e9b5ed85a04d673223be74051ad09c20c929c058af8c7a2467d73d6220a002e not found: ID does not exist"
Apr 28 19:44:10.455671 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.455655 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:44:10.455724 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.455673 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:44:10.455724 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.455683 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08c281f3-e613-4594-a94e-8dd63066737e-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:44:10.455724 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.455691 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:44:10.455724 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.455698 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:44:10.455724 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.455706 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08c281f3-e613-4594-a94e-8dd63066737e-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:44:10.455724 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:10.455715 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wmblg\" (UniqueName: \"kubernetes.io/projected/08c281f3-e613-4594-a94e-8dd63066737e-kube-api-access-wmblg\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:44:12.110716 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:44:12.110682 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c281f3-e613-4594-a94e-8dd63066737e" path="/var/lib/kubelet/pods/08c281f3-e613-4594-a94e-8dd63066737e/volumes"
Apr 28 19:48:02.208993 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:02.208958 2565 generic.go:358] "Generic (PLEG): container finished" podID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerID="7fbb40f3730c98f1112c7421d5e2ec6a71d7e9928081e5926818e1b7470b746b" exitCode=0
Apr 28 19:48:02.209426 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:02.209023 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" event={"ID":"66c088e2-be4b-40ff-b545-7fa3cf860b94","Type":"ContainerDied","Data":"7fbb40f3730c98f1112c7421d5e2ec6a71d7e9928081e5926818e1b7470b746b"}
Apr 28 19:48:03.214149 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:03.214114 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" event={"ID":"66c088e2-be4b-40ff-b545-7fa3cf860b94","Type":"ContainerStarted","Data":"a646874780a713e02516528938336c088f5a1a56f91dd67b93079306a24ddcf8"}
Apr 28 19:48:03.237057 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:03.237006 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podStartSLOduration=254.236991193 podStartE2EDuration="4m14.236991193s" podCreationTimestamp="2026-04-28 19:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:48:03.234516105 +0000 UTC m=+1867.740232772" watchObservedRunningTime="2026-04-28 19:48:03.236991193 +0000 UTC m=+1867.742707960"
Apr 28 19:48:09.998150 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:09.998103 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"
Apr 28 19:48:09.998740 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:09.998191 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"
Apr 28 19:48:09.999734 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:09.999702 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:48:19.998404 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:19.998351 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:48:29.997675 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:29.997617 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:48:39.997763 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:39.997714 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:48:49.998343 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:49.998296 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:48:59.998536 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:48:59.998488 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:49:09.998353 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:09.998307 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:49:19.998121 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:19.998073 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:49:29.997972 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:29.997929 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 28 19:49:40.007501 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:40.007469 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"
Apr 28 19:49:40.015369 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:40.015344 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"
Apr 28 19:49:45.554520 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:45.554438 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"]
Apr 28 19:49:45.554958 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:45.554720 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" containerID="cri-o://a646874780a713e02516528938336c088f5a1a56f91dd67b93079306a24ddcf8" gracePeriod=30
Apr 28 19:49:54.761332 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.761296 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"]
Apr 28 19:49:54.761818 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.761778 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="storage-initializer"
Apr 28 19:49:54.761933 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.761824 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="storage-initializer"
Apr 28 19:49:54.761933 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.761838 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main"
Apr 28 19:49:54.761933 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.761848 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main"
Apr 28 19:49:54.762081 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.762029 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="08c281f3-e613-4594-a94e-8dd63066737e" containerName="main"
Apr 28 19:49:54.767555 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.767535 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.770068 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.770043 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 28 19:49:54.778935 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.778908 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"]
Apr 28 19:49:54.822836 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.822808 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kserve-provision-location\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.822974 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.822848 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lddh2\" (UniqueName: \"kubernetes.io/projected/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kube-api-access-lddh2\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.822974 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.822883 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-dshm\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.822974 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.822927 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-model-cache\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.822974 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.822955 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tmp-dir\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.823132 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.822981 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tls-certs\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.823132 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.823004 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-home\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.924381 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924340 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lddh2\" (UniqueName: \"kubernetes.io/projected/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kube-api-access-lddh2\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.924565 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924408 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-dshm\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.924565 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924432 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-model-cache\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.924565 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924451 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tmp-dir\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.924565 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924476 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tls-certs\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.924565 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924497 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-home\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.924565 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924540 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kserve-provision-location\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.925005 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924968 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-model-cache\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.925132 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.924996 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kserve-provision-location\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.925132 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.925065 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tmp-dir\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.925264 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.925153 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-home\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.926856 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.926833 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-dshm\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.927140 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.927121 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tls-certs\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:54.932145 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:54.932119 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lddh2\" (UniqueName: \"kubernetes.io/projected/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kube-api-access-lddh2\") pod \"router-with-refs-test-kserve-54946d576-jgddv\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:55.080411 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:55.080326 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"
Apr 28 19:49:55.205121 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:55.205086 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"]
Apr 28 19:49:55.209034 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:49:55.209006 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa649ce_1bc4_47b4_8212_72028c1fed4d.slice/crio-902e830ddeac3ed78dce6100ab29b8fbf4d5c3420cb5b58ed79e1aa1d70c664b WatchSource:0}: Error finding container 902e830ddeac3ed78dce6100ab29b8fbf4d5c3420cb5b58ed79e1aa1d70c664b: Status 404 returned error can't find the container with id 902e830ddeac3ed78dce6100ab29b8fbf4d5c3420cb5b58ed79e1aa1d70c664b
Apr 28 19:49:55.210796 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:55.210780 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:49:55.613445 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:55.613408 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" event={"ID":"1fa649ce-1bc4-47b4-8212-72028c1fed4d","Type":"ContainerStarted","Data":"d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547"}
Apr 28 19:49:55.613445 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:49:55.613449 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" event={"ID":"1fa649ce-1bc4-47b4-8212-72028c1fed4d","Type":"ContainerStarted","Data":"902e830ddeac3ed78dce6100ab29b8fbf4d5c3420cb5b58ed79e1aa1d70c664b"}
Apr 28 19:50:15.685924 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.685896 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6ff7c96cb4-j9987_66c088e2-be4b-40ff-b545-7fa3cf860b94/main/0.log"
Apr 28 19:50:15.686320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.686220 2565 generic.go:358] "Generic (PLEG): container finished" podID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerID="a646874780a713e02516528938336c088f5a1a56f91dd67b93079306a24ddcf8" exitCode=137
Apr 28 19:50:15.686320 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.686267 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" event={"ID":"66c088e2-be4b-40ff-b545-7fa3cf860b94","Type":"ContainerDied","Data":"a646874780a713e02516528938336c088f5a1a56f91dd67b93079306a24ddcf8"}
Apr 28 19:50:15.787288 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.787264 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6ff7c96cb4-j9987_66c088e2-be4b-40ff-b545-7fa3cf860b94/main/0.log"
Apr 28 19:50:15.787627 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.787611 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"
Apr 28 19:50:15.911955 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.911863 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-kserve-provision-location\") pod \"66c088e2-be4b-40ff-b545-7fa3cf860b94\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") "
Apr 28 19:50:15.911955 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.911914 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-model-cache\") pod \"66c088e2-be4b-40ff-b545-7fa3cf860b94\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") "
Apr 28 19:50:15.911955 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.911935 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66c088e2-be4b-40ff-b545-7fa3cf860b94-tls-certs\") pod \"66c088e2-be4b-40ff-b545-7fa3cf860b94\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") "
Apr 28 19:50:15.912287 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.911967 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-home\") pod \"66c088e2-be4b-40ff-b545-7fa3cf860b94\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") "
Apr 28 19:50:15.912287 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.911983 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-dshm\") pod \"66c088e2-be4b-40ff-b545-7fa3cf860b94\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") "
Apr 28 19:50:15.912287 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.912009 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-tmp-dir\") pod \"66c088e2-be4b-40ff-b545-7fa3cf860b94\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") "
Apr 28 19:50:15.912287 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.912074 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txq6k\" (UniqueName: \"kubernetes.io/projected/66c088e2-be4b-40ff-b545-7fa3cf860b94-kube-api-access-txq6k\") pod \"66c088e2-be4b-40ff-b545-7fa3cf860b94\" (UID: \"66c088e2-be4b-40ff-b545-7fa3cf860b94\") "
Apr 28 19:50:15.912287 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.912237 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-model-cache" (OuterVolumeSpecName: "model-cache") pod "66c088e2-be4b-40ff-b545-7fa3cf860b94" (UID: "66c088e2-be4b-40ff-b545-7fa3cf860b94"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:15.912583 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.912409 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:50:15.912878 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.912851 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-home" (OuterVolumeSpecName: "home") pod "66c088e2-be4b-40ff-b545-7fa3cf860b94" (UID: "66c088e2-be4b-40ff-b545-7fa3cf860b94"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:15.914278 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.914245 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-dshm" (OuterVolumeSpecName: "dshm") pod "66c088e2-be4b-40ff-b545-7fa3cf860b94" (UID: "66c088e2-be4b-40ff-b545-7fa3cf860b94"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:15.914389 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.914364 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c088e2-be4b-40ff-b545-7fa3cf860b94-kube-api-access-txq6k" (OuterVolumeSpecName: "kube-api-access-txq6k") pod "66c088e2-be4b-40ff-b545-7fa3cf860b94" (UID: "66c088e2-be4b-40ff-b545-7fa3cf860b94"). InnerVolumeSpecName "kube-api-access-txq6k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:50:15.914555 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.914533 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c088e2-be4b-40ff-b545-7fa3cf860b94-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "66c088e2-be4b-40ff-b545-7fa3cf860b94" (UID: "66c088e2-be4b-40ff-b545-7fa3cf860b94"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:50:15.931079 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.931045 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "66c088e2-be4b-40ff-b545-7fa3cf860b94" (UID: "66c088e2-be4b-40ff-b545-7fa3cf860b94"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:15.973918 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:15.973882 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "66c088e2-be4b-40ff-b545-7fa3cf860b94" (UID: "66c088e2-be4b-40ff-b545-7fa3cf860b94"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:50:16.013838 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.013810 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:50:16.013838 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.013838 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/66c088e2-be4b-40ff-b545-7fa3cf860b94-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:50:16.013972 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.013850 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:50:16.013972 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.013859 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 19:50:16.013972 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.013867 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66c088e2-be4b-40ff-b545-7fa3cf860b94-tmp-dir\") on
node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:50:16.013972 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.013876 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-txq6k\" (UniqueName: \"kubernetes.io/projected/66c088e2-be4b-40ff-b545-7fa3cf860b94-kube-api-access-txq6k\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:50:16.690823 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.690787 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6ff7c96cb4-j9987_66c088e2-be4b-40ff-b545-7fa3cf860b94/main/0.log" Apr 28 19:50:16.691258 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.691212 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" event={"ID":"66c088e2-be4b-40ff-b545-7fa3cf860b94","Type":"ContainerDied","Data":"cab06e6409cee5613bc1fa781131244bce676beaf44ef6f5e6011b9fbdb643ba"} Apr 28 19:50:16.691329 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.691260 2565 scope.go:117] "RemoveContainer" containerID="a646874780a713e02516528938336c088f5a1a56f91dd67b93079306a24ddcf8" Apr 28 19:50:16.691329 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.691225 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987" Apr 28 19:50:16.699194 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.699174 2565 scope.go:117] "RemoveContainer" containerID="7fbb40f3730c98f1112c7421d5e2ec6a71d7e9928081e5926818e1b7470b746b" Apr 28 19:50:16.711152 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.711114 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"] Apr 28 19:50:16.714079 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:16.714056 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6ff7c96cb4-j9987"] Apr 28 19:50:18.110963 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:18.110926 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" path="/var/lib/kubelet/pods/66c088e2-be4b-40ff-b545-7fa3cf860b94/volumes" Apr 28 19:50:50.802679 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:50.802644 2565 generic.go:358] "Generic (PLEG): container finished" podID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerID="d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547" exitCode=0 Apr 28 19:50:50.803083 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:50.802692 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" event={"ID":"1fa649ce-1bc4-47b4-8212-72028c1fed4d","Type":"ContainerDied","Data":"d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547"} Apr 28 19:50:51.807814 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:51.807777 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" event={"ID":"1fa649ce-1bc4-47b4-8212-72028c1fed4d","Type":"ContainerStarted","Data":"575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f"} Apr 28 19:50:51.831997 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:51.831941 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podStartSLOduration=57.831923788 podStartE2EDuration="57.831923788s" podCreationTimestamp="2026-04-28 19:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:50:51.830004148 +0000 UTC m=+2036.335720813" watchObservedRunningTime="2026-04-28 19:50:51.831923788 +0000 UTC m=+2036.337640455" Apr 28 19:50:55.081187 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:55.081118 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" Apr 28 19:50:55.081187 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:55.081192 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" Apr 28 19:50:55.082851 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:55.082822 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 28 19:50:57.966494 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.966457 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj"] Apr 28 19:50:57.966864 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.966848 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="storage-initializer" Apr 28 19:50:57.966920 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.966867 2565 
state_mem.go:107] "Deleted CPUSet assignment" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="storage-initializer" Apr 28 19:50:57.966920 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.966881 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" Apr 28 19:50:57.966920 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.966888 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" Apr 28 19:50:57.967024 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.966939 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="66c088e2-be4b-40ff-b545-7fa3cf860b94" containerName="main" Apr 28 19:50:57.970930 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.970914 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:57.973682 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.973658 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 28 19:50:57.981102 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:57.981075 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj"] Apr 28 19:50:58.081512 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.081481 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-dshm\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.081714 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.081519 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.081714 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.081640 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tls-certs\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.081714 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.081671 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-home\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.081714 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.081690 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tmp-dir\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.081932 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.081782 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-model-cache\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.081932 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.081815 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7chlp\" (UniqueName: \"kubernetes.io/projected/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kube-api-access-7chlp\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.182786 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.182741 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tls-certs\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.182786 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.182788 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-home\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183026 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.182927 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tmp-dir\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: 
\"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183026 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.183001 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-model-cache\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183130 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.183045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7chlp\" (UniqueName: \"kubernetes.io/projected/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kube-api-access-7chlp\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183130 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.183123 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-dshm\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183269 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.183145 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-home\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183269 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.183258 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tmp-dir\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183443 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.183322 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-model-cache\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183501 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.183257 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.183602 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.183578 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.185312 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.185291 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-dshm\") pod 
\"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.185732 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.185713 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tls-certs\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.190998 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.190974 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7chlp\" (UniqueName: \"kubernetes.io/projected/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kube-api-access-7chlp\") pod \"precise-prefix-cache-test-kserve-55744dbcf4-qmqlj\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.281815 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.281782 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 19:50:58.416485 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.416447 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj"] Apr 28 19:50:58.419625 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:50:58.419591 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91505c15_3e9d_4ee7_aa43_50c6a176b4bf.slice/crio-d00664422dd072b5a1807dcb41d8c2268cc032168807b1009f465c2ce73eb3ab WatchSource:0}: Error finding container d00664422dd072b5a1807dcb41d8c2268cc032168807b1009f465c2ce73eb3ab: Status 404 returned error can't find the container with id d00664422dd072b5a1807dcb41d8c2268cc032168807b1009f465c2ce73eb3ab Apr 28 19:50:58.836419 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.836372 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" event={"ID":"91505c15-3e9d-4ee7-aa43-50c6a176b4bf","Type":"ContainerStarted","Data":"a1cda21c73240d56feff7a9de9a66b7bd908974c57bca1fad25f5ce8e1192d68"} Apr 28 19:50:58.836419 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:50:58.836418 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" event={"ID":"91505c15-3e9d-4ee7-aa43-50c6a176b4bf","Type":"ContainerStarted","Data":"d00664422dd072b5a1807dcb41d8c2268cc032168807b1009f465c2ce73eb3ab"} Apr 28 19:51:05.081416 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:51:05.081353 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 
28 19:51:15.081538 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:51:15.081441 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 28 19:51:25.081285 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:51:25.081231 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 28 19:51:35.081830 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:51:35.081784 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 28 19:51:45.080980 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:51:45.080935 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 28 19:51:55.081155 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:51:55.081107 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 
28 19:52:05.081241 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:05.081196 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 28 19:52:15.080957 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:15.080905 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 28 19:52:25.091463 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:25.091421 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" Apr 28 19:52:25.099583 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:25.099553 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" Apr 28 19:52:30.901666 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:30.901633 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"] Apr 28 19:52:30.902264 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:30.902052 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" containerID="cri-o://575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f" gracePeriod=30 Apr 28 19:52:49.708652 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.708573 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz"] Apr 28 19:52:49.714959 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.714933 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn"] Apr 28 19:52:49.715128 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.715093 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.718178 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.718144 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 28 19:52:49.718298 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.718150 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-t82fz\"" Apr 28 19:52:49.719083 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.719062 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.725502 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.725444 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz"] Apr 28 19:52:49.732836 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.732814 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn"] Apr 28 19:52:49.896845 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.896804 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.896845 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.896848 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.897097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.896909 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwkfr\" (UniqueName: \"kubernetes.io/projected/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kube-api-access-jwkfr\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.897097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897018 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.897097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897052 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.897097 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897088 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.897340 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897125 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.897340 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897142 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.897340 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897224 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.897340 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897254 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.897340 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897270 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: 
\"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.897340 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.897651 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897350 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl52k\" (UniqueName: \"kubernetes.io/projected/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kube-api-access-fl52k\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.897651 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.897402 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.998661 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.998580 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: 
\"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.998907 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.998882 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.999032 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.998924 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999032 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.998956 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkfr\" (UniqueName: \"kubernetes.io/projected/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kube-api-access-jwkfr\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.999032 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.999217 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999030 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.999217 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999041 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999217 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999101 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999217 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999149 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.999217 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999193 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999477 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999236 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.999477 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999277 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.999477 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999304 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999477 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999336 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999477 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999367 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl52k\" (UniqueName: \"kubernetes.io/projected/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kube-api-access-fl52k\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999739 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999507 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999739 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999539 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:49.999843 
ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999757 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:49.999900 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999847 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:50.000052 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:49.999994 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:50.000130 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.000117 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:50.000207 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.000181 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:50.002386 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.002358 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:50.002495 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.002462 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:50.002558 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.002504 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:50.002666 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.002646 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:50.018671 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.018647 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl52k\" (UniqueName: \"kubernetes.io/projected/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kube-api-access-fl52k\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:50.018814 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.018796 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwkfr\" (UniqueName: \"kubernetes.io/projected/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kube-api-access-jwkfr\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:50.028805 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.028780 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:50.036611 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.036578 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:52:50.175970 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.175938 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz"] Apr 28 19:52:50.178282 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:52:50.178247 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc21cda1_2baf_45c3_9e09_d68413fa00b4.slice/crio-4d7be9913a8cb1f508d43cb9d1dcad11f809e3f21478a4f10b3644906d4eae2b WatchSource:0}: Error finding container 4d7be9913a8cb1f508d43cb9d1dcad11f809e3f21478a4f10b3644906d4eae2b: Status 404 returned error can't find the container with id 4d7be9913a8cb1f508d43cb9d1dcad11f809e3f21478a4f10b3644906d4eae2b Apr 28 19:52:50.235248 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.235213 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" event={"ID":"cc21cda1-2baf-45c3-9e09-d68413fa00b4","Type":"ContainerStarted","Data":"4d7be9913a8cb1f508d43cb9d1dcad11f809e3f21478a4f10b3644906d4eae2b"} Apr 28 19:52:50.440369 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:50.440334 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn"] Apr 28 19:52:50.443607 ip-10-0-141-41 kubenswrapper[2565]: W0428 19:52:50.443568 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96bfeafd_b753_41f4_bd7d_2008b55dfd13.slice/crio-b1cbad9b19478d3689fd2d97dc3f5ca201aa8645335846f25dbe73f11d238e6e WatchSource:0}: Error finding container b1cbad9b19478d3689fd2d97dc3f5ca201aa8645335846f25dbe73f11d238e6e: Status 404 returned error can't find the container with id 
b1cbad9b19478d3689fd2d97dc3f5ca201aa8645335846f25dbe73f11d238e6e Apr 28 19:52:51.240044 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:51.240006 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" event={"ID":"96bfeafd-b753-41f4-bd7d-2008b55dfd13","Type":"ContainerStarted","Data":"7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01"} Apr 28 19:52:51.240044 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:51.240047 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" event={"ID":"96bfeafd-b753-41f4-bd7d-2008b55dfd13","Type":"ContainerStarted","Data":"b1cbad9b19478d3689fd2d97dc3f5ca201aa8645335846f25dbe73f11d238e6e"} Apr 28 19:52:51.241412 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:51.241380 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" event={"ID":"cc21cda1-2baf-45c3-9e09-d68413fa00b4","Type":"ContainerStarted","Data":"cce42369c4bde871de2b2312c42713320104202ae411f70816adc2439077bf7e"} Apr 28 19:52:51.241558 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:51.241493 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:52:52.247851 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:52.247811 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" event={"ID":"cc21cda1-2baf-45c3-9e09-d68413fa00b4","Type":"ContainerStarted","Data":"d986bd2ca47266e9574628c4e983987d51a1698c1d58d066ce63c68e1b4b6e6f"} Apr 28 19:52:55.258747 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:55.258716 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerID="7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01" exitCode=0 Apr 28 19:52:55.259181 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:55.258797 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" event={"ID":"96bfeafd-b753-41f4-bd7d-2008b55dfd13","Type":"ContainerDied","Data":"7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01"} Apr 28 19:52:56.264018 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:56.263981 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" event={"ID":"96bfeafd-b753-41f4-bd7d-2008b55dfd13","Type":"ContainerStarted","Data":"fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c"} Apr 28 19:52:56.291698 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:52:56.291647 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podStartSLOduration=7.291512182 podStartE2EDuration="7.291512182s" podCreationTimestamp="2026-04-28 19:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:52:56.289447957 +0000 UTC m=+2160.795164622" watchObservedRunningTime="2026-04-28 19:52:56.291512182 +0000 UTC m=+2160.797228849" Apr 28 19:53:00.037042 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:00.037005 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:53:00.037511 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:00.037056 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:53:00.038176 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:00.038130 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:53:01.174075 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.174048 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-54946d576-jgddv_1fa649ce-1bc4-47b4-8212-72028c1fed4d/main/0.log" Apr 28 19:53:01.174455 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.174435 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" Apr 28 19:53:01.207778 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.207743 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kserve-provision-location\") pod \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " Apr 28 19:53:01.207915 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.207790 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-home\") pod \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " Apr 28 19:53:01.207915 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.207837 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-model-cache\") pod \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " Apr 28 19:53:01.207915 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.207900 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tmp-dir\") pod \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " Apr 28 19:53:01.208074 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.207927 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tls-certs\") pod \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " Apr 28 19:53:01.208074 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.207966 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lddh2\" (UniqueName: \"kubernetes.io/projected/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kube-api-access-lddh2\") pod \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " Apr 28 19:53:01.208074 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.208029 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-dshm\") pod \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\" (UID: \"1fa649ce-1bc4-47b4-8212-72028c1fed4d\") " Apr 28 19:53:01.208580 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.208410 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-model-cache" (OuterVolumeSpecName: "model-cache") pod "1fa649ce-1bc4-47b4-8212-72028c1fed4d" (UID: "1fa649ce-1bc4-47b4-8212-72028c1fed4d"). 
InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:53:01.209484 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.209451 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-home" (OuterVolumeSpecName: "home") pod "1fa649ce-1bc4-47b4-8212-72028c1fed4d" (UID: "1fa649ce-1bc4-47b4-8212-72028c1fed4d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:53:01.211579 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.211544 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kube-api-access-lddh2" (OuterVolumeSpecName: "kube-api-access-lddh2") pod "1fa649ce-1bc4-47b4-8212-72028c1fed4d" (UID: "1fa649ce-1bc4-47b4-8212-72028c1fed4d"). InnerVolumeSpecName "kube-api-access-lddh2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:53:01.211847 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.211818 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1fa649ce-1bc4-47b4-8212-72028c1fed4d" (UID: "1fa649ce-1bc4-47b4-8212-72028c1fed4d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:53:01.211916 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.211870 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-dshm" (OuterVolumeSpecName: "dshm") pod "1fa649ce-1bc4-47b4-8212-72028c1fed4d" (UID: "1fa649ce-1bc4-47b4-8212-72028c1fed4d"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:53:01.219951 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.219924 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "1fa649ce-1bc4-47b4-8212-72028c1fed4d" (UID: "1fa649ce-1bc4-47b4-8212-72028c1fed4d"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:53:01.275375 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.275326 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1fa649ce-1bc4-47b4-8212-72028c1fed4d" (UID: "1fa649ce-1bc4-47b4-8212-72028c1fed4d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:53:01.282937 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.282915 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-54946d576-jgddv_1fa649ce-1bc4-47b4-8212-72028c1fed4d/main/0.log" Apr 28 19:53:01.283314 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.283289 2565 generic.go:358] "Generic (PLEG): container finished" podID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerID="575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f" exitCode=137 Apr 28 19:53:01.283375 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.283352 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" event={"ID":"1fa649ce-1bc4-47b4-8212-72028c1fed4d","Type":"ContainerDied","Data":"575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f"} Apr 28 19:53:01.283422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.283374 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" event={"ID":"1fa649ce-1bc4-47b4-8212-72028c1fed4d","Type":"ContainerDied","Data":"902e830ddeac3ed78dce6100ab29b8fbf4d5c3420cb5b58ed79e1aa1d70c664b"} Apr 28 19:53:01.283422 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.283378 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv" Apr 28 19:53:01.283496 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.283388 2565 scope.go:117] "RemoveContainer" containerID="575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f" Apr 28 19:53:01.291852 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.291832 2565 scope.go:117] "RemoveContainer" containerID="d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547" Apr 28 19:53:01.309323 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.309296 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:53:01.309323 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.309326 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1fa649ce-1bc4-47b4-8212-72028c1fed4d-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:53:01.309492 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.309340 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lddh2\" (UniqueName: \"kubernetes.io/projected/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kube-api-access-lddh2\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:53:01.309492 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.309353 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:53:01.309492 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.309370 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:53:01.309492 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.309383 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:53:01.309492 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.309396 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa649ce-1bc4-47b4-8212-72028c1fed4d-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 19:53:01.311536 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.311515 2565 scope.go:117] "RemoveContainer" containerID="575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f" Apr 28 19:53:01.311881 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:53:01.311821 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f\": container with ID starting with 575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f not found: ID does not exist" containerID="575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f" Apr 28 19:53:01.312038 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.311861 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f"} err="failed to 
get container status \"575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f\": rpc error: code = NotFound desc = could not find container \"575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f\": container with ID starting with 575a90f2c995035da0b79d3f4a9f06ee125197509293c4044831b1d6c8bc7d6f not found: ID does not exist" Apr 28 19:53:01.312038 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.311934 2565 scope.go:117] "RemoveContainer" containerID="d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547" Apr 28 19:53:01.312327 ip-10-0-141-41 kubenswrapper[2565]: E0428 19:53:01.312305 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547\": container with ID starting with d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547 not found: ID does not exist" containerID="d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547" Apr 28 19:53:01.312475 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.312443 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547"} err="failed to get container status \"d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547\": rpc error: code = NotFound desc = could not find container \"d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547\": container with ID starting with d8247f01cfde49c575bb83a70e1489b95f670b11800f29149a64f038c6524547 not found: ID does not exist" Apr 28 19:53:01.312680 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.312650 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"] Apr 28 19:53:01.314860 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:01.314841 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-54946d576-jgddv"] Apr 28 19:53:02.111789 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:02.111756 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" path="/var/lib/kubelet/pods/1fa649ce-1bc4-47b4-8212-72028c1fed4d/volumes" Apr 28 19:53:03.264463 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:03.264434 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 19:53:10.037527 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:10.037482 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:53:20.037177 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:20.037112 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:53:30.037288 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:30.037234 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:53:40.037387 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:40.037336 2565 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:53:50.037205 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:53:50.037133 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:54:00.037408 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:54:00.037348 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:54:10.037848 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:54:10.037804 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:54:20.037210 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:54:20.037145 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 28 19:54:30.047427 ip-10-0-141-41 
kubenswrapper[2565]: I0428 19:54:30.047394 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 19:54:30.055140 ip-10-0-141-41 kubenswrapper[2565]: I0428 19:54:30.055114 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 20:05:34.875686 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:34.875647 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj"] Apr 28 20:05:34.876309 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:34.875964 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" podUID="91505c15-3e9d-4ee7-aa43-50c6a176b4bf" containerName="storage-initializer" containerID="cri-o://a1cda21c73240d56feff7a9de9a66b7bd908974c57bca1fad25f5ce8e1192d68" gracePeriod=30 Apr 28 20:05:54.977773 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:54.977741 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p"] Apr 28 20:05:54.978244 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:54.978114 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" Apr 28 20:05:54.978244 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:54.978125 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" Apr 28 20:05:54.978244 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:54.978139 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="storage-initializer" Apr 28 20:05:54.978244 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:05:54.978145 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="storage-initializer" Apr 28 20:05:54.978244 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:54.978242 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fa649ce-1bc4-47b4-8212-72028c1fed4d" containerName="main" Apr 28 20:05:54.981389 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:54.981371 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:54.984110 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:54.984090 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 28 20:05:54.994354 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:54.994331 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p"] Apr 28 20:05:55.091687 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.091644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-tmp-dir\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.091851 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.091759 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-model-cache\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.091851 ip-10-0-141-41 kubenswrapper[2565]: 
I0428 20:05:55.091800 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-home\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.091851 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.091824 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-dshm\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.091983 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.091858 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bbd05f9-5b60-4859-9805-7f5481687c38-tls-certs\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.091983 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.091894 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrpcv\" (UniqueName: \"kubernetes.io/projected/1bbd05f9-5b60-4859-9805-7f5481687c38-kube-api-access-qrpcv\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.091983 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.091926 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-kserve-provision-location\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.192638 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.192598 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-dshm\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.192809 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.192657 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bbd05f9-5b60-4859-9805-7f5481687c38-tls-certs\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.192809 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.192687 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrpcv\" (UniqueName: \"kubernetes.io/projected/1bbd05f9-5b60-4859-9805-7f5481687c38-kube-api-access-qrpcv\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.192809 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.192726 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-kserve-provision-location\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.192809 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.192786 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-tmp-dir\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.193017 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.192852 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-model-cache\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.193017 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.192896 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-home\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.193224 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.193195 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-tmp-dir\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.193312 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.193242 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-kserve-provision-location\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.193312 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.193282 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-home\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.193382 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.193356 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-model-cache\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.195121 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.195092 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-dshm\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.195256 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.195232 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bbd05f9-5b60-4859-9805-7f5481687c38-tls-certs\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.200902 ip-10-0-141-41 kubenswrapper[2565]: I0428 
20:05:55.200877 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrpcv\" (UniqueName: \"kubernetes.io/projected/1bbd05f9-5b60-4859-9805-7f5481687c38-kube-api-access-qrpcv\") pod \"stop-feature-test-kserve-76b8b9b84-fms4p\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.291842 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.291809 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:05:55.631545 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.631392 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p"] Apr 28 20:05:55.633742 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:05:55.633703 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bbd05f9_5b60_4859_9805_7f5481687c38.slice/crio-876b0acf80a80121aafb7c9140c1d27d7c42991cbef91bd8eb7a2c1135edd9ab WatchSource:0}: Error finding container 876b0acf80a80121aafb7c9140c1d27d7c42991cbef91bd8eb7a2c1135edd9ab: Status 404 returned error can't find the container with id 876b0acf80a80121aafb7c9140c1d27d7c42991cbef91bd8eb7a2c1135edd9ab Apr 28 20:05:55.635594 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.635574 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:05:55.936115 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.936026 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" event={"ID":"1bbd05f9-5b60-4859-9805-7f5481687c38","Type":"ContainerStarted","Data":"e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a"} Apr 28 20:05:55.936115 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:05:55.936064 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" event={"ID":"1bbd05f9-5b60-4859-9805-7f5481687c38","Type":"ContainerStarted","Data":"876b0acf80a80121aafb7c9140c1d27d7c42991cbef91bd8eb7a2c1135edd9ab"} Apr 28 20:06:04.966696 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:04.966641 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-55744dbcf4-qmqlj_91505c15-3e9d-4ee7-aa43-50c6a176b4bf/storage-initializer/0.log" Apr 28 20:06:04.967200 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:04.966705 2565 generic.go:358] "Generic (PLEG): container finished" podID="91505c15-3e9d-4ee7-aa43-50c6a176b4bf" containerID="a1cda21c73240d56feff7a9de9a66b7bd908974c57bca1fad25f5ce8e1192d68" exitCode=137 Apr 28 20:06:04.967200 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:04.966790 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" event={"ID":"91505c15-3e9d-4ee7-aa43-50c6a176b4bf","Type":"ContainerDied","Data":"a1cda21c73240d56feff7a9de9a66b7bd908974c57bca1fad25f5ce8e1192d68"} Apr 28 20:06:05.071771 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.071745 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-55744dbcf4-qmqlj_91505c15-3e9d-4ee7-aa43-50c6a176b4bf/storage-initializer/0.log" Apr 28 20:06:05.071891 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.071808 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 20:06:05.196881 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.196795 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tmp-dir\") pod \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " Apr 28 20:06:05.196881 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.196841 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kserve-provision-location\") pod \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " Apr 28 20:06:05.197057 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.196889 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tls-certs\") pod \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " Apr 28 20:06:05.197057 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.196905 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-model-cache\") pod \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " Apr 28 20:06:05.197057 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.196974 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7chlp\" (UniqueName: \"kubernetes.io/projected/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kube-api-access-7chlp\") pod \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " Apr 28 20:06:05.197057 
ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.197025 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-dshm\") pod \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " Apr 28 20:06:05.197057 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.197050 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-home\") pod \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\" (UID: \"91505c15-3e9d-4ee7-aa43-50c6a176b4bf\") " Apr 28 20:06:05.197330 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.197047 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "91505c15-3e9d-4ee7-aa43-50c6a176b4bf" (UID: "91505c15-3e9d-4ee7-aa43-50c6a176b4bf"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:06:05.197330 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.197246 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-model-cache" (OuterVolumeSpecName: "model-cache") pod "91505c15-3e9d-4ee7-aa43-50c6a176b4bf" (UID: "91505c15-3e9d-4ee7-aa43-50c6a176b4bf"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:06:05.197452 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.197426 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-home" (OuterVolumeSpecName: "home") pod "91505c15-3e9d-4ee7-aa43-50c6a176b4bf" (UID: "91505c15-3e9d-4ee7-aa43-50c6a176b4bf"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:06:05.197611 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.197577 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:06:05.197611 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.197605 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:06:05.197730 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.197621 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:06:05.199155 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.199125 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-dshm" (OuterVolumeSpecName: "dshm") pod "91505c15-3e9d-4ee7-aa43-50c6a176b4bf" (UID: "91505c15-3e9d-4ee7-aa43-50c6a176b4bf"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:06:05.199276 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.199229 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kube-api-access-7chlp" (OuterVolumeSpecName: "kube-api-access-7chlp") pod "91505c15-3e9d-4ee7-aa43-50c6a176b4bf" (UID: "91505c15-3e9d-4ee7-aa43-50c6a176b4bf"). InnerVolumeSpecName "kube-api-access-7chlp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:06:05.199276 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.199231 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "91505c15-3e9d-4ee7-aa43-50c6a176b4bf" (UID: "91505c15-3e9d-4ee7-aa43-50c6a176b4bf"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:06:05.219883 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.219860 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "91505c15-3e9d-4ee7-aa43-50c6a176b4bf" (UID: "91505c15-3e9d-4ee7-aa43-50c6a176b4bf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:06:05.298839 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.298807 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:06:05.298839 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.298834 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:06:05.298839 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.298846 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7chlp\" (UniqueName: \"kubernetes.io/projected/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-kube-api-access-7chlp\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:06:05.299031 ip-10-0-141-41 kubenswrapper[2565]: 
I0428 20:06:05.298857 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91505c15-3e9d-4ee7-aa43-50c6a176b4bf-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:06:05.973291 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.973265 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-55744dbcf4-qmqlj_91505c15-3e9d-4ee7-aa43-50c6a176b4bf/storage-initializer/0.log" Apr 28 20:06:05.973735 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.973388 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" Apr 28 20:06:05.973735 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.973398 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj" event={"ID":"91505c15-3e9d-4ee7-aa43-50c6a176b4bf","Type":"ContainerDied","Data":"d00664422dd072b5a1807dcb41d8c2268cc032168807b1009f465c2ce73eb3ab"} Apr 28 20:06:05.973735 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:05.973445 2565 scope.go:117] "RemoveContainer" containerID="a1cda21c73240d56feff7a9de9a66b7bd908974c57bca1fad25f5ce8e1192d68" Apr 28 20:06:06.010601 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:06.010561 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj"] Apr 28 20:06:06.013414 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:06.013388 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-55744dbcf4-qmqlj"] Apr 28 20:06:06.111067 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:06:06.111033 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91505c15-3e9d-4ee7-aa43-50c6a176b4bf" path="/var/lib/kubelet/pods/91505c15-3e9d-4ee7-aa43-50c6a176b4bf/volumes" Apr 
28 20:07:00.165960 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:00.165925 2565 generic.go:358] "Generic (PLEG): container finished" podID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerID="e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a" exitCode=0 Apr 28 20:07:00.166412 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:00.166002 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" event={"ID":"1bbd05f9-5b60-4859-9805-7f5481687c38","Type":"ContainerDied","Data":"e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a"} Apr 28 20:07:01.172340 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:01.172301 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" event={"ID":"1bbd05f9-5b60-4859-9805-7f5481687c38","Type":"ContainerStarted","Data":"ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd"} Apr 28 20:07:01.199363 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:01.199306 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podStartSLOduration=67.199285787 podStartE2EDuration="1m7.199285787s" podCreationTimestamp="2026-04-28 20:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:07:01.196591051 +0000 UTC m=+3005.702307718" watchObservedRunningTime="2026-04-28 20:07:01.199285787 +0000 UTC m=+3005.705002453" Apr 28 20:07:05.292708 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:05.292672 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:07:05.292708 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:05.292706 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:07:05.294440 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:05.294415 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 28 20:07:15.292548 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:15.292489 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 28 20:07:25.293122 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:25.293077 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 28 20:07:33.892107 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:33.892069 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz"] Apr 28 20:07:33.892730 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:33.892426 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" containerID="cri-o://cce42369c4bde871de2b2312c42713320104202ae411f70816adc2439077bf7e" gracePeriod=30 Apr 28 20:07:33.892730 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:33.892447 2565 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="storage-initializer" containerID="cri-o://d986bd2ca47266e9574628c4e983987d51a1698c1d58d066ce63c68e1b4b6e6f" gracePeriod=30 Apr 28 20:07:33.897429 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:33.897400 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn"] Apr 28 20:07:33.897702 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:33.897676 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" containerID="cri-o://fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c" gracePeriod=30 Apr 28 20:07:34.289084 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:34.289053 2565 generic.go:358] "Generic (PLEG): container finished" podID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerID="cce42369c4bde871de2b2312c42713320104202ae411f70816adc2439077bf7e" exitCode=0 Apr 28 20:07:34.289261 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:34.289121 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" event={"ID":"cc21cda1-2baf-45c3-9e09-d68413fa00b4","Type":"ContainerDied","Data":"cce42369c4bde871de2b2312c42713320104202ae411f70816adc2439077bf7e"} Apr 28 20:07:35.292533 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:35.292479 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: 
connect: connection refused" Apr 28 20:07:40.029724 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:40.029663 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 28 20:07:43.251601 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:43.251554 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 28 20:07:45.292855 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:45.292812 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 28 20:07:49.127153 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.127105 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc"] Apr 28 20:07:49.127732 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.127707 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91505c15-3e9d-4ee7-aa43-50c6a176b4bf" containerName="storage-initializer" Apr 28 20:07:49.127820 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.127737 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="91505c15-3e9d-4ee7-aa43-50c6a176b4bf" containerName="storage-initializer" Apr 28 20:07:49.127897 
ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.127854 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="91505c15-3e9d-4ee7-aa43-50c6a176b4bf" containerName="storage-initializer" Apr 28 20:07:49.131874 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.131845 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.132972 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.132946 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p"] Apr 28 20:07:49.134511 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.134487 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 28 20:07:49.134778 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.134760 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-pfndn\"" Apr 28 20:07:49.136445 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.136425 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.148449 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.148426 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc"] Apr 28 20:07:49.150737 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.150714 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p"] Apr 28 20:07:49.268064 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268030 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e040453-eac7-4f57-97d1-7a73dd2144f0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.268286 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268072 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-dshm\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.268286 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268095 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03d91aea-3f10-4501-93f1-7581e0a15fa6-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.268286 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:07:49.268190 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmlm4\" (UniqueName: \"kubernetes.io/projected/03d91aea-3f10-4501-93f1-7581e0a15fa6-kube-api-access-cmlm4\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.268286 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268230 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.268466 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268331 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-home\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.268466 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268376 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72hg\" (UniqueName: \"kubernetes.io/projected/2e040453-eac7-4f57-97d1-7a73dd2144f0-kube-api-access-s72hg\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.268466 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:07:49.268412 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.268734 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268530 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.268734 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268571 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.268734 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268664 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.268734 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268706 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.268897 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268743 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.268897 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.268828 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.369921 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.369885 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370119 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.369935 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370119 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.369959 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e040453-eac7-4f57-97d1-7a73dd2144f0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370119 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.369979 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-dshm\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370119 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.369999 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03d91aea-3f10-4501-93f1-7581e0a15fa6-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370119 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370021 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmlm4\" (UniqueName: \"kubernetes.io/projected/03d91aea-3f10-4501-93f1-7581e0a15fa6-kube-api-access-cmlm4\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: 
\"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370119 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370044 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370119 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370088 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-home\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370526 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370112 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s72hg\" (UniqueName: \"kubernetes.io/projected/2e040453-eac7-4f57-97d1-7a73dd2144f0-kube-api-access-s72hg\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370526 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370150 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370526 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370269 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370526 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370308 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370526 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370526 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370383 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370526 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370396 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370526 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370383 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370921 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370729 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.370921 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370736 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.370921 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.370836 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-home\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.371132 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.371103 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.371229 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.371130 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.371229 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.371151 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.372618 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.372585 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-dshm\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: 
\"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.373217 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.373147 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.373217 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.373203 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03d91aea-3f10-4501-93f1-7581e0a15fa6-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.373868 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.373847 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e040453-eac7-4f57-97d1-7a73dd2144f0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.378473 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.378415 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72hg\" (UniqueName: \"kubernetes.io/projected/2e040453-eac7-4f57-97d1-7a73dd2144f0-kube-api-access-s72hg\") pod \"custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.378473 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:07:49.378429 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmlm4\" (UniqueName: \"kubernetes.io/projected/03d91aea-3f10-4501-93f1-7581e0a15fa6-kube-api-access-cmlm4\") pod \"custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.447923 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.447873 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:49.458126 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.458079 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:49.605094 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.605021 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc"] Apr 28 20:07:49.609411 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:07:49.609369 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03d91aea_3f10_4501_93f1_7581e0a15fa6.slice/crio-dc61581f24c6fb298de12cb51ac8fbbec476a62af1059dc367a64f311f345ba7 WatchSource:0}: Error finding container dc61581f24c6fb298de12cb51ac8fbbec476a62af1059dc367a64f311f345ba7: Status 404 returned error can't find the container with id dc61581f24c6fb298de12cb51ac8fbbec476a62af1059dc367a64f311f345ba7 Apr 28 20:07:49.624187 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:49.624130 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p"] Apr 28 20:07:49.628640 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:07:49.628577 2565 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e040453_eac7_4f57_97d1_7a73dd2144f0.slice/crio-7faeddf316995dc0e945f0fbebb72b9adac5454f52e8086b8677bbe5af1e5e6c WatchSource:0}: Error finding container 7faeddf316995dc0e945f0fbebb72b9adac5454f52e8086b8677bbe5af1e5e6c: Status 404 returned error can't find the container with id 7faeddf316995dc0e945f0fbebb72b9adac5454f52e8086b8677bbe5af1e5e6c Apr 28 20:07:50.029543 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:50.029491 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 28 20:07:50.353084 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:50.352989 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" event={"ID":"2e040453-eac7-4f57-97d1-7a73dd2144f0","Type":"ContainerStarted","Data":"631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1"} Apr 28 20:07:50.353084 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:50.353031 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" event={"ID":"2e040453-eac7-4f57-97d1-7a73dd2144f0","Type":"ContainerStarted","Data":"7faeddf316995dc0e945f0fbebb72b9adac5454f52e8086b8677bbe5af1e5e6c"} Apr 28 20:07:50.354481 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:50.354455 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" 
event={"ID":"03d91aea-3f10-4501-93f1-7581e0a15fa6","Type":"ContainerStarted","Data":"1561d4545b086e7d5a7cf9b969629a4159c1c0065eee24c3a2d254b476d9206c"} Apr 28 20:07:50.354594 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:50.354486 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" event={"ID":"03d91aea-3f10-4501-93f1-7581e0a15fa6","Type":"ContainerStarted","Data":"dc61581f24c6fb298de12cb51ac8fbbec476a62af1059dc367a64f311f345ba7"} Apr 28 20:07:50.354594 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:50.354582 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:07:51.361819 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:51.361771 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" event={"ID":"03d91aea-3f10-4501-93f1-7581e0a15fa6","Type":"ContainerStarted","Data":"726388c4b3bcac27989be26311cbaf48ffce94ef7a7c9ab1204fd3c001b4dea4"} Apr 28 20:07:53.251652 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:53.251610 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 28 20:07:55.293319 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:55.293277 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 28 20:07:55.380466 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:07:55.380432 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerID="631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1" exitCode=0 Apr 28 20:07:55.380634 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:55.380504 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" event={"ID":"2e040453-eac7-4f57-97d1-7a73dd2144f0","Type":"ContainerDied","Data":"631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1"} Apr 28 20:07:56.386602 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:56.386568 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" event={"ID":"2e040453-eac7-4f57-97d1-7a73dd2144f0","Type":"ContainerStarted","Data":"47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6"} Apr 28 20:07:56.414391 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:56.414318 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podStartSLOduration=7.414298095 podStartE2EDuration="7.414298095s" podCreationTimestamp="2026-04-28 20:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:07:56.410761959 +0000 UTC m=+3060.916478662" watchObservedRunningTime="2026-04-28 20:07:56.414298095 +0000 UTC m=+3060.920014765" Apr 28 20:07:59.458655 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:59.458622 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:59.459262 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:59.458669 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:07:59.460062 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:07:59.460030 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 20:08:00.029621 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:00.029577 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 28 20:08:00.029790 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:00.029650 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 20:08:02.387614 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:02.387584 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:08:03.251821 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:03.251774 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 28 20:08:04.282610 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.282587 2565 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 20:08:04.332067 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332037 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-model-cache\") pod \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " Apr 28 20:08:04.332271 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332079 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tls-certs\") pod \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " Apr 28 20:08:04.332271 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332104 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwkfr\" (UniqueName: \"kubernetes.io/projected/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kube-api-access-jwkfr\") pod \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " Apr 28 20:08:04.332271 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332149 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tmp-dir\") pod \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " Apr 28 20:08:04.332271 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332188 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kserve-provision-location\") pod \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " Apr 28 
20:08:04.332271 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332260 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-dshm\") pod \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " Apr 28 20:08:04.332574 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332284 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-home\") pod \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\" (UID: \"96bfeafd-b753-41f4-bd7d-2008b55dfd13\") " Apr 28 20:08:04.332574 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332330 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-model-cache" (OuterVolumeSpecName: "model-cache") pod "96bfeafd-b753-41f4-bd7d-2008b55dfd13" (UID: "96bfeafd-b753-41f4-bd7d-2008b55dfd13"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.332682 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.332670 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.333330 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.333302 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-home" (OuterVolumeSpecName: "home") pod "96bfeafd-b753-41f4-bd7d-2008b55dfd13" (UID: "96bfeafd-b753-41f4-bd7d-2008b55dfd13"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.334835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.334802 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-dshm" (OuterVolumeSpecName: "dshm") pod "96bfeafd-b753-41f4-bd7d-2008b55dfd13" (UID: "96bfeafd-b753-41f4-bd7d-2008b55dfd13"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.335990 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.335965 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "96bfeafd-b753-41f4-bd7d-2008b55dfd13" (UID: "96bfeafd-b753-41f4-bd7d-2008b55dfd13"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:08:04.337397 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.337368 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kube-api-access-jwkfr" (OuterVolumeSpecName: "kube-api-access-jwkfr") pod "96bfeafd-b753-41f4-bd7d-2008b55dfd13" (UID: "96bfeafd-b753-41f4-bd7d-2008b55dfd13"). InnerVolumeSpecName "kube-api-access-jwkfr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:08:04.352003 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.351963 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "96bfeafd-b753-41f4-bd7d-2008b55dfd13" (UID: "96bfeafd-b753-41f4-bd7d-2008b55dfd13"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.368823 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.368781 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "96bfeafd-b753-41f4-bd7d-2008b55dfd13" (UID: "96bfeafd-b753-41f4-bd7d-2008b55dfd13"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.421127 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.421085 2565 generic.go:358] "Generic (PLEG): container finished" podID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerID="fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c" exitCode=137 Apr 28 20:08:04.421316 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.421235 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" Apr 28 20:08:04.421316 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.421234 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" event={"ID":"96bfeafd-b753-41f4-bd7d-2008b55dfd13","Type":"ContainerDied","Data":"fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c"} Apr 28 20:08:04.421443 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.421358 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn" event={"ID":"96bfeafd-b753-41f4-bd7d-2008b55dfd13","Type":"ContainerDied","Data":"b1cbad9b19478d3689fd2d97dc3f5ca201aa8645335846f25dbe73f11d238e6e"} Apr 28 20:08:04.421443 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.421385 2565 scope.go:117] "RemoveContainer" 
containerID="fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c" Apr 28 20:08:04.423101 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.423080 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz_cc21cda1-2baf-45c3-9e09-d68413fa00b4/storage-initializer/0.log" Apr 28 20:08:04.423602 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.423572 2565 generic.go:358] "Generic (PLEG): container finished" podID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerID="d986bd2ca47266e9574628c4e983987d51a1698c1d58d066ce63c68e1b4b6e6f" exitCode=137 Apr 28 20:08:04.423718 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.423606 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" event={"ID":"cc21cda1-2baf-45c3-9e09-d68413fa00b4","Type":"ContainerDied","Data":"d986bd2ca47266e9574628c4e983987d51a1698c1d58d066ce63c68e1b4b6e6f"} Apr 28 20:08:04.432290 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.432263 2565 scope.go:117] "RemoveContainer" containerID="7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01" Apr 28 20:08:04.433413 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.433313 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.433413 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.433342 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.433413 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.433356 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.433413 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.433372 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwkfr\" (UniqueName: \"kubernetes.io/projected/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kube-api-access-jwkfr\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.433413 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.433389 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.433413 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.433405 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96bfeafd-b753-41f4-bd7d-2008b55dfd13-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.449056 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.448725 2565 scope.go:117] "RemoveContainer" containerID="fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c" Apr 28 20:08:04.450202 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.450143 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn"] Apr 28 20:08:04.450823 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:08:04.450660 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c\": container with ID starting with fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c not found: ID does not exist" containerID="fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c" Apr 28 
20:08:04.450823 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.450694 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c"} err="failed to get container status \"fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c\": rpc error: code = NotFound desc = could not find container \"fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c\": container with ID starting with fe01014b4b7508fd4640a01ddddd5a017f09024f68ce03533ac7abd8938d920c not found: ID does not exist" Apr 28 20:08:04.450823 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.450721 2565 scope.go:117] "RemoveContainer" containerID="7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01" Apr 28 20:08:04.451464 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:08:04.451435 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01\": container with ID starting with 7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01 not found: ID does not exist" containerID="7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01" Apr 28 20:08:04.451566 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.451470 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01"} err="failed to get container status \"7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01\": rpc error: code = NotFound desc = could not find container \"7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01\": container with ID starting with 7140e0c6009495f6e6322564c6010d27bd8a920d1593fb18b7d2ac80b07bfe01 not found: ID does not exist" Apr 28 20:08:04.452824 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.452791 2565 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-c4mggfn"] Apr 28 20:08:04.595109 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.595086 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz_cc21cda1-2baf-45c3-9e09-d68413fa00b4/storage-initializer/0.log" Apr 28 20:08:04.595581 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.595564 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 20:08:04.737080 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.736986 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kserve-provision-location\") pod \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " Apr 28 20:08:04.737080 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737054 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tmp-dir\") pod \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " Apr 28 20:08:04.737344 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737098 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-home\") pod \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " Apr 28 20:08:04.737344 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737119 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl52k\" (UniqueName: 
\"kubernetes.io/projected/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kube-api-access-fl52k\") pod \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " Apr 28 20:08:04.737344 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737217 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-model-cache\") pod \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " Apr 28 20:08:04.737344 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737243 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-dshm\") pod \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " Apr 28 20:08:04.737344 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737268 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tls-certs\") pod \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\" (UID: \"cc21cda1-2baf-45c3-9e09-d68413fa00b4\") " Apr 28 20:08:04.737591 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737341 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "cc21cda1-2baf-45c3-9e09-d68413fa00b4" (UID: "cc21cda1-2baf-45c3-9e09-d68413fa00b4"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.737591 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737394 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-home" (OuterVolumeSpecName: "home") pod "cc21cda1-2baf-45c3-9e09-d68413fa00b4" (UID: "cc21cda1-2baf-45c3-9e09-d68413fa00b4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.737591 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737550 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-model-cache" (OuterVolumeSpecName: "model-cache") pod "cc21cda1-2baf-45c3-9e09-d68413fa00b4" (UID: "cc21cda1-2baf-45c3-9e09-d68413fa00b4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.737749 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737676 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.737749 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737697 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.737749 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.737714 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.739583 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.739542 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kube-api-access-fl52k" (OuterVolumeSpecName: "kube-api-access-fl52k") pod "cc21cda1-2baf-45c3-9e09-d68413fa00b4" (UID: "cc21cda1-2baf-45c3-9e09-d68413fa00b4"). InnerVolumeSpecName "kube-api-access-fl52k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:08:04.739938 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.739919 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-dshm" (OuterVolumeSpecName: "dshm") pod "cc21cda1-2baf-45c3-9e09-d68413fa00b4" (UID: "cc21cda1-2baf-45c3-9e09-d68413fa00b4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.740012 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.739992 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cc21cda1-2baf-45c3-9e09-d68413fa00b4" (UID: "cc21cda1-2baf-45c3-9e09-d68413fa00b4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:08:04.755741 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.755711 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cc21cda1-2baf-45c3-9e09-d68413fa00b4" (UID: "cc21cda1-2baf-45c3-9e09-d68413fa00b4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:04.839126 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.839086 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.839126 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.839116 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc21cda1-2baf-45c3-9e09-d68413fa00b4-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.839126 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.839133 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:04.839360 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:04.839146 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fl52k\" (UniqueName: \"kubernetes.io/projected/cc21cda1-2baf-45c3-9e09-d68413fa00b4-kube-api-access-fl52k\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:08:05.293206 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:05.293154 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 28 20:08:05.430055 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:05.430023 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz_cc21cda1-2baf-45c3-9e09-d68413fa00b4/storage-initializer/0.log" Apr 28 
20:08:05.430499 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:05.430473 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" event={"ID":"cc21cda1-2baf-45c3-9e09-d68413fa00b4","Type":"ContainerDied","Data":"4d7be9913a8cb1f508d43cb9d1dcad11f809e3f21478a4f10b3644906d4eae2b"} Apr 28 20:08:05.430577 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:05.430513 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz" Apr 28 20:08:05.430577 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:05.430523 2565 scope.go:117] "RemoveContainer" containerID="d986bd2ca47266e9574628c4e983987d51a1698c1d58d066ce63c68e1b4b6e6f" Apr 28 20:08:05.455191 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:05.455152 2565 scope.go:117] "RemoveContainer" containerID="cce42369c4bde871de2b2312c42713320104202ae411f70816adc2439077bf7e" Apr 28 20:08:05.476618 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:05.475582 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz"] Apr 28 20:08:05.484352 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:05.484321 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7f8f847f5-lfzcz"] Apr 28 20:08:06.111377 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:06.111340 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" path="/var/lib/kubelet/pods/96bfeafd-b753-41f4-bd7d-2008b55dfd13/volumes" Apr 28 20:08:06.111968 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:06.111942 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" 
path="/var/lib/kubelet/pods/cc21cda1-2baf-45c3-9e09-d68413fa00b4/volumes" Apr 28 20:08:09.459274 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:09.459223 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 20:08:15.293152 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:15.293100 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 28 20:08:19.458997 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:19.458956 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 20:08:25.292968 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:25.292907 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 28 20:08:29.458455 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:29.458416 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 20:08:35.302737 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:35.302701 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:08:35.317828 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:35.317797 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:08:36.241899 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:36.241863 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p"] Apr 28 20:08:36.548626 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:36.548556 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" containerID="cri-o://ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd" gracePeriod=30 Apr 28 20:08:39.458665 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:39.458626 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 20:08:49.459738 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:49.459685 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 
20:08:54.077413 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077375 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn"] Apr 28 20:08:54.077878 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077858 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="storage-initializer" Apr 28 20:08:54.078018 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077881 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="storage-initializer" Apr 28 20:08:54.078018 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077895 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" Apr 28 20:08:54.078018 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077904 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" Apr 28 20:08:54.078018 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077918 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" Apr 28 20:08:54.078018 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077930 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" Apr 28 20:08:54.078018 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077960 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="storage-initializer" Apr 28 20:08:54.078018 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.077979 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="storage-initializer" Apr 28 
20:08:54.078640 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.078095 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="storage-initializer" Apr 28 20:08:54.078640 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.078110 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc21cda1-2baf-45c3-9e09-d68413fa00b4" containerName="llm-d-routing-sidecar" Apr 28 20:08:54.078640 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.078127 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="96bfeafd-b753-41f4-bd7d-2008b55dfd13" containerName="main" Apr 28 20:08:54.081951 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.081933 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.093377 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.093354 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn"] Apr 28 20:08:54.204018 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.203979 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-dshm\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.204218 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.204034 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-kserve-provision-location\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.204218 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.204068 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16877967-b3a4-4cab-bf40-3e55beed1d8c-tls-certs\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.204218 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.204199 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-home\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.204360 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.204320 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-tmp-dir\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.204360 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.204354 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-model-cache\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.204442 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.204379 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5482\" (UniqueName: \"kubernetes.io/projected/16877967-b3a4-4cab-bf40-3e55beed1d8c-kube-api-access-p5482\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305058 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305025 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-home\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305231 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305085 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-tmp-dir\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305231 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305103 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-model-cache\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305231 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305121 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5482\" (UniqueName: \"kubernetes.io/projected/16877967-b3a4-4cab-bf40-3e55beed1d8c-kube-api-access-p5482\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: 
\"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305231 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305153 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-dshm\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305231 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305207 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-kserve-provision-location\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305446 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305241 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16877967-b3a4-4cab-bf40-3e55beed1d8c-tls-certs\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305541 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305517 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-model-cache\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305588 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305517 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-home\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305778 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305758 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-tmp-dir\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.305871 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.305838 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-kserve-provision-location\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.307570 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.307545 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/16877967-b3a4-4cab-bf40-3e55beed1d8c-dshm\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.307710 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.307693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16877967-b3a4-4cab-bf40-3e55beed1d8c-tls-certs\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.328675 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:08:54.328600 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5482\" (UniqueName: \"kubernetes.io/projected/16877967-b3a4-4cab-bf40-3e55beed1d8c-kube-api-access-p5482\") pod \"stop-feature-test-kserve-76b8b9b84-z2vkn\" (UID: \"16877967-b3a4-4cab-bf40-3e55beed1d8c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.395219 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.395183 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" Apr 28 20:08:54.540296 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.540211 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn"] Apr 28 20:08:54.542880 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:08:54.542849 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16877967_b3a4_4cab_bf40_3e55beed1d8c.slice/crio-41f5ebe65571601e22cccb665d60481d4973ad0eef9db6c94e417481dd6a79e4 WatchSource:0}: Error finding container 41f5ebe65571601e22cccb665d60481d4973ad0eef9db6c94e417481dd6a79e4: Status 404 returned error can't find the container with id 41f5ebe65571601e22cccb665d60481d4973ad0eef9db6c94e417481dd6a79e4 Apr 28 20:08:54.613862 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.613832 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" event={"ID":"16877967-b3a4-4cab-bf40-3e55beed1d8c","Type":"ContainerStarted","Data":"91403c14134a23919336c090a732487723632f3e08f1ed008fbce38ba9bd0cf0"} Apr 28 20:08:54.613996 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:54.613869 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-z2vkn" 
event={"ID":"16877967-b3a4-4cab-bf40-3e55beed1d8c","Type":"ContainerStarted","Data":"41f5ebe65571601e22cccb665d60481d4973ad0eef9db6c94e417481dd6a79e4"} Apr 28 20:08:59.458633 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:08:59.458591 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 20:09:06.839938 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:06.839908 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-fms4p_1bbd05f9-5b60-4859-9805-7f5481687c38/main/0.log" Apr 28 20:09:06.840307 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:06.840278 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:09:07.029651 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.029620 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-tmp-dir\") pod \"1bbd05f9-5b60-4859-9805-7f5481687c38\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " Apr 28 20:09:07.029651 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.029654 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-home\") pod \"1bbd05f9-5b60-4859-9805-7f5481687c38\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " Apr 28 20:09:07.029910 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.029676 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-dshm\") pod \"1bbd05f9-5b60-4859-9805-7f5481687c38\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " Apr 28 20:09:07.029910 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.029696 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-kserve-provision-location\") pod \"1bbd05f9-5b60-4859-9805-7f5481687c38\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " Apr 28 20:09:07.029910 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.029749 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bbd05f9-5b60-4859-9805-7f5481687c38-tls-certs\") pod \"1bbd05f9-5b60-4859-9805-7f5481687c38\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " Apr 28 20:09:07.029910 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.029772 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-model-cache\") pod \"1bbd05f9-5b60-4859-9805-7f5481687c38\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " Apr 28 20:09:07.029910 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.029814 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrpcv\" (UniqueName: \"kubernetes.io/projected/1bbd05f9-5b60-4859-9805-7f5481687c38-kube-api-access-qrpcv\") pod \"1bbd05f9-5b60-4859-9805-7f5481687c38\" (UID: \"1bbd05f9-5b60-4859-9805-7f5481687c38\") " Apr 28 20:09:07.030313 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.030261 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-model-cache" (OuterVolumeSpecName: "model-cache") pod "1bbd05f9-5b60-4859-9805-7f5481687c38" (UID: 
"1bbd05f9-5b60-4859-9805-7f5481687c38"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:09:07.030426 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.030305 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-home" (OuterVolumeSpecName: "home") pod "1bbd05f9-5b60-4859-9805-7f5481687c38" (UID: "1bbd05f9-5b60-4859-9805-7f5481687c38"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:09:07.031975 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.031950 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-dshm" (OuterVolumeSpecName: "dshm") pod "1bbd05f9-5b60-4859-9805-7f5481687c38" (UID: "1bbd05f9-5b60-4859-9805-7f5481687c38"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:09:07.032367 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.032344 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bbd05f9-5b60-4859-9805-7f5481687c38-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1bbd05f9-5b60-4859-9805-7f5481687c38" (UID: "1bbd05f9-5b60-4859-9805-7f5481687c38"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:09:07.032460 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.032414 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bbd05f9-5b60-4859-9805-7f5481687c38-kube-api-access-qrpcv" (OuterVolumeSpecName: "kube-api-access-qrpcv") pod "1bbd05f9-5b60-4859-9805-7f5481687c38" (UID: "1bbd05f9-5b60-4859-9805-7f5481687c38"). InnerVolumeSpecName "kube-api-access-qrpcv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:09:07.042981 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.042956 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "1bbd05f9-5b60-4859-9805-7f5481687c38" (UID: "1bbd05f9-5b60-4859-9805-7f5481687c38"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:09:07.106639 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.106547 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1bbd05f9-5b60-4859-9805-7f5481687c38" (UID: "1bbd05f9-5b60-4859-9805-7f5481687c38"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:09:07.131155 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.131124 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1bbd05f9-5b60-4859-9805-7f5481687c38-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:09:07.131155 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.131153 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:09:07.131295 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.131176 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrpcv\" (UniqueName: \"kubernetes.io/projected/1bbd05f9-5b60-4859-9805-7f5481687c38-kube-api-access-qrpcv\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:09:07.131295 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.131185 2565 
reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:09:07.131295 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.131194 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:09:07.131295 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.131202 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:09:07.131295 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.131210 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bbd05f9-5b60-4859-9805-7f5481687c38-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:09:07.663636 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.663609 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-fms4p_1bbd05f9-5b60-4859-9805-7f5481687c38/main/0.log" Apr 28 20:09:07.664008 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.663983 2565 generic.go:358] "Generic (PLEG): container finished" podID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerID="ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd" exitCode=137 Apr 28 20:09:07.664094 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.664052 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" 
event={"ID":"1bbd05f9-5b60-4859-9805-7f5481687c38","Type":"ContainerDied","Data":"ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd"} Apr 28 20:09:07.664094 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.664075 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" Apr 28 20:09:07.664236 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.664100 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p" event={"ID":"1bbd05f9-5b60-4859-9805-7f5481687c38","Type":"ContainerDied","Data":"876b0acf80a80121aafb7c9140c1d27d7c42991cbef91bd8eb7a2c1135edd9ab"} Apr 28 20:09:07.664236 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.664122 2565 scope.go:117] "RemoveContainer" containerID="ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd" Apr 28 20:09:07.672684 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.672667 2565 scope.go:117] "RemoveContainer" containerID="e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a" Apr 28 20:09:07.687126 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.687105 2565 scope.go:117] "RemoveContainer" containerID="ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd" Apr 28 20:09:07.687435 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:09:07.687411 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd\": container with ID starting with ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd not found: ID does not exist" containerID="ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd" Apr 28 20:09:07.687519 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.687442 2565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd"} err="failed to get container status \"ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd\": rpc error: code = NotFound desc = could not find container \"ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd\": container with ID starting with ed07590a7beab4a0749f49f3fd92f6895257eaf0fab5471736575b4952ddd4dd not found: ID does not exist" Apr 28 20:09:07.687519 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.687459 2565 scope.go:117] "RemoveContainer" containerID="e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a" Apr 28 20:09:07.687696 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:09:07.687679 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a\": container with ID starting with e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a not found: ID does not exist" containerID="e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a" Apr 28 20:09:07.687736 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.687702 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a"} err="failed to get container status \"e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a\": rpc error: code = NotFound desc = could not find container \"e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a\": container with ID starting with e0f0db98f42e54a0fee30e06323db2aaf9c9a90330a45952e0592b002c56c08a not found: ID does not exist" Apr 28 20:09:07.704702 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:07.704668 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p"] Apr 28 20:09:07.713560 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:09:07.713533 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76b8b9b84-fms4p"] Apr 28 20:09:08.113088 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:08.113050 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" path="/var/lib/kubelet/pods/1bbd05f9-5b60-4859-9805-7f5481687c38/volumes" Apr 28 20:09:09.458980 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:09.458929 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 20:09:19.459639 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:19.459590 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 28 20:09:29.469724 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:29.469690 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:09:29.477501 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:09:29.477475 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:22:37.055203 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:37.055154 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p"] Apr 28 20:22:37.057640 
ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:37.055452 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" containerID="cri-o://47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6" gracePeriod=30 Apr 28 20:22:37.060929 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:37.060903 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc"] Apr 28 20:22:37.061194 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:37.061173 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" containerID="cri-o://1561d4545b086e7d5a7cf9b969629a4159c1c0065eee24c3a2d254b476d9206c" gracePeriod=30 Apr 28 20:22:37.061257 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:37.061210 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="storage-initializer" containerID="cri-o://726388c4b3bcac27989be26311cbaf48ffce94ef7a7c9ab1204fd3c001b4dea4" gracePeriod=30 Apr 28 20:22:37.454283 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:37.454203 2565 generic.go:358] "Generic (PLEG): container finished" podID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerID="1561d4545b086e7d5a7cf9b969629a4159c1c0065eee24c3a2d254b476d9206c" exitCode=0 Apr 28 20:22:37.454283 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:37.454255 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" 
event={"ID":"03d91aea-3f10-4501-93f1-7581e0a15fa6","Type":"ContainerDied","Data":"1561d4545b086e7d5a7cf9b969629a4159c1c0065eee24c3a2d254b476d9206c"} Apr 28 20:22:39.449277 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:39.449229 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 28 20:22:42.369687 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:42.369647 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 28 20:22:47.695905 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.695864 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm"] Apr 28 20:22:47.696330 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.696302 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="storage-initializer" Apr 28 20:22:47.696330 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.696315 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="storage-initializer" Apr 28 20:22:47.696330 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.696327 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" Apr 28 20:22:47.696507 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.696335 2565 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" Apr 28 20:22:47.696507 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.696391 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bbd05f9-5b60-4859-9805-7f5481687c38" containerName="main" Apr 28 20:22:47.700074 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.700051 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.703088 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.703066 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-j2lpr\"" Apr 28 20:22:47.703243 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.703103 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 28 20:22:47.712918 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.712894 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp"] Apr 28 20:22:47.717922 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.717898 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm"] Apr 28 20:22:47.718046 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.718032 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.725834 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.725764 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp"] Apr 28 20:22:47.804348 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804315 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tls-certs\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.804535 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804376 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj97q\" (UniqueName: \"kubernetes.io/projected/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kube-api-access-kj97q\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.804535 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804431 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-home\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.804535 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804481 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tmp-dir\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.804535 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804509 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.804535 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804527 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-model-cache\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.804710 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804556 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.804710 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804611 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jzw\" (UniqueName: 
\"kubernetes.io/projected/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kube-api-access-d8jzw\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.804710 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804632 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-home\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.804710 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804668 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.804710 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804684 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-dshm\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.804710 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804707 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kserve-provision-location\") pod 
\"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.804966 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804729 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.804966 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.804862 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.905993 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.905960 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tmp-dir\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.905993 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.905997 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906211 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906015 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-model-cache\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906315 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906290 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906370 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906350 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jzw\" (UniqueName: \"kubernetes.io/projected/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kube-api-access-d8jzw\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906430 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906404 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-model-cache\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906482 ip-10-0-141-41 kubenswrapper[2565]: I0428 
20:22:47.906438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906482 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906457 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-home\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906578 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906457 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tmp-dir\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906578 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906489 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906578 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906528 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-dshm\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906578 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906570 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906577 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906611 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906661 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: 
\"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906669 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-home\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906720 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.906835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906756 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tls-certs\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj97q\" (UniqueName: \"kubernetes.io/projected/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kube-api-access-kj97q\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.906835 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906838 
2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-home\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.907320 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.906873 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.907320 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.907133 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-home\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.909291 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.909261 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.909423 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.909300 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-dshm\") pod 
\"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.915075 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.909896 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tls-certs\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:47.915075 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.910276 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.922438 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.922413 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jzw\" (UniqueName: \"kubernetes.io/projected/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kube-api-access-d8jzw\") pod \"router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:47.922536 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:47.922437 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj97q\" (UniqueName: \"kubernetes.io/projected/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kube-api-access-kj97q\") pod \"router-with-refs-pd-test-kserve-7b796954f4-44vdm\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:48.012529 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.012496 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:48.030381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.030351 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:22:48.152190 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.152150 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm"] Apr 28 20:22:48.154689 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:22:48.154660 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987dfbe2_20b7_4b7d_956e_cd6b8f618ba7.slice/crio-f8b7cc9ca11e0f75cf9b85298f86332d04c988a6f0937f7af0876220c56ce095 WatchSource:0}: Error finding container f8b7cc9ca11e0f75cf9b85298f86332d04c988a6f0937f7af0876220c56ce095: Status 404 returned error can't find the container with id f8b7cc9ca11e0f75cf9b85298f86332d04c988a6f0937f7af0876220c56ce095 Apr 28 20:22:48.156648 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.156629 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:22:48.380117 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.380090 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp"] Apr 28 20:22:48.382515 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:22:48.382488 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01fdfbc0_ffda_4f56_99aa_7a0fab093146.slice/crio-8d0e1c06a561ebc462eaa14aa9e95becac7ca830ace76177ae960bdcea68268f WatchSource:0}: Error finding container 8d0e1c06a561ebc462eaa14aa9e95becac7ca830ace76177ae960bdcea68268f: Status 404 returned error can't find the container with id 8d0e1c06a561ebc462eaa14aa9e95becac7ca830ace76177ae960bdcea68268f Apr 28 20:22:48.499374 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.499335 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" event={"ID":"01fdfbc0-ffda-4f56-99aa-7a0fab093146","Type":"ContainerStarted","Data":"cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988"} Apr 28 20:22:48.499561 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.499380 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" event={"ID":"01fdfbc0-ffda-4f56-99aa-7a0fab093146","Type":"ContainerStarted","Data":"8d0e1c06a561ebc462eaa14aa9e95becac7ca830ace76177ae960bdcea68268f"} Apr 28 20:22:48.500774 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.500742 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" event={"ID":"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7","Type":"ContainerStarted","Data":"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d"} Apr 28 20:22:48.500885 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.500778 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" event={"ID":"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7","Type":"ContainerStarted","Data":"f8b7cc9ca11e0f75cf9b85298f86332d04c988a6f0937f7af0876220c56ce095"} Apr 28 20:22:48.500885 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:48.500835 2565 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:22:49.448713 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:49.448668 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 28 20:22:49.508427 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:49.508376 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" event={"ID":"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7","Type":"ContainerStarted","Data":"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792"} Apr 28 20:22:52.369243 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:52.369187 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 28 20:22:59.449234 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:59.449185 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 28 20:22:59.449673 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:22:59.449292 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 
20:23:00.524429 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:00.524396 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:23:02.369391 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:02.369351 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 28 20:23:07.338264 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.338239 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:23:07.388800 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.388768 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-kserve-provision-location\") pod \"2e040453-eac7-4f57-97d1-7a73dd2144f0\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " Apr 28 20:23:07.388948 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.388807 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-dshm\") pod \"2e040453-eac7-4f57-97d1-7a73dd2144f0\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " Apr 28 20:23:07.388948 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.388869 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e040453-eac7-4f57-97d1-7a73dd2144f0-tls-certs\") pod \"2e040453-eac7-4f57-97d1-7a73dd2144f0\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") 
" Apr 28 20:23:07.388948 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.388885 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s72hg\" (UniqueName: \"kubernetes.io/projected/2e040453-eac7-4f57-97d1-7a73dd2144f0-kube-api-access-s72hg\") pod \"2e040453-eac7-4f57-97d1-7a73dd2144f0\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " Apr 28 20:23:07.388948 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.388925 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-tmp-dir\") pod \"2e040453-eac7-4f57-97d1-7a73dd2144f0\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " Apr 28 20:23:07.389190 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.389072 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-model-cache\") pod \"2e040453-eac7-4f57-97d1-7a73dd2144f0\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " Apr 28 20:23:07.389190 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.389115 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-home\") pod \"2e040453-eac7-4f57-97d1-7a73dd2144f0\" (UID: \"2e040453-eac7-4f57-97d1-7a73dd2144f0\") " Apr 28 20:23:07.389385 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.389344 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-model-cache" (OuterVolumeSpecName: "model-cache") pod "2e040453-eac7-4f57-97d1-7a73dd2144f0" (UID: "2e040453-eac7-4f57-97d1-7a73dd2144f0"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.389515 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.389494 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.389959 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.389934 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-home" (OuterVolumeSpecName: "home") pod "2e040453-eac7-4f57-97d1-7a73dd2144f0" (UID: "2e040453-eac7-4f57-97d1-7a73dd2144f0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.391014 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.390986 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-dshm" (OuterVolumeSpecName: "dshm") pod "2e040453-eac7-4f57-97d1-7a73dd2144f0" (UID: "2e040453-eac7-4f57-97d1-7a73dd2144f0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.391117 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.391063 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e040453-eac7-4f57-97d1-7a73dd2144f0-kube-api-access-s72hg" (OuterVolumeSpecName: "kube-api-access-s72hg") pod "2e040453-eac7-4f57-97d1-7a73dd2144f0" (UID: "2e040453-eac7-4f57-97d1-7a73dd2144f0"). InnerVolumeSpecName "kube-api-access-s72hg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:23:07.391305 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.391287 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e040453-eac7-4f57-97d1-7a73dd2144f0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2e040453-eac7-4f57-97d1-7a73dd2144f0" (UID: "2e040453-eac7-4f57-97d1-7a73dd2144f0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:23:07.402705 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.402670 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "2e040453-eac7-4f57-97d1-7a73dd2144f0" (UID: "2e040453-eac7-4f57-97d1-7a73dd2144f0"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.448637 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.448607 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2e040453-eac7-4f57-97d1-7a73dd2144f0" (UID: "2e040453-eac7-4f57-97d1-7a73dd2144f0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.490195 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.490149 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e040453-eac7-4f57-97d1-7a73dd2144f0-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.490195 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.490197 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s72hg\" (UniqueName: \"kubernetes.io/projected/2e040453-eac7-4f57-97d1-7a73dd2144f0-kube-api-access-s72hg\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.490320 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.490210 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.490320 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.490220 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.490320 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.490229 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.490320 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.490237 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e040453-eac7-4f57-97d1-7a73dd2144f0-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.574460 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.574394 2565 generic.go:358] "Generic 
(PLEG): container finished" podID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerID="47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6" exitCode=137 Apr 28 20:23:07.574559 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.574462 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" event={"ID":"2e040453-eac7-4f57-97d1-7a73dd2144f0","Type":"ContainerDied","Data":"47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6"} Apr 28 20:23:07.574559 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.574495 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" event={"ID":"2e040453-eac7-4f57-97d1-7a73dd2144f0","Type":"ContainerDied","Data":"7faeddf316995dc0e945f0fbebb72b9adac5454f52e8086b8677bbe5af1e5e6c"} Apr 28 20:23:07.574559 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.574473 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p" Apr 28 20:23:07.574559 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.574516 2565 scope.go:117] "RemoveContainer" containerID="47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6" Apr 28 20:23:07.576045 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.576029 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc_03d91aea-3f10-4501-93f1-7581e0a15fa6/storage-initializer/0.log" Apr 28 20:23:07.576398 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.576379 2565 generic.go:358] "Generic (PLEG): container finished" podID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerID="726388c4b3bcac27989be26311cbaf48ffce94ef7a7c9ab1204fd3c001b4dea4" exitCode=137 Apr 28 20:23:07.576483 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.576432 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" event={"ID":"03d91aea-3f10-4501-93f1-7581e0a15fa6","Type":"ContainerDied","Data":"726388c4b3bcac27989be26311cbaf48ffce94ef7a7c9ab1204fd3c001b4dea4"} Apr 28 20:23:07.582411 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.582391 2565 scope.go:117] "RemoveContainer" containerID="631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1" Apr 28 20:23:07.596221 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.596195 2565 scope.go:117] "RemoveContainer" containerID="47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6" Apr 28 20:23:07.596482 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:23:07.596461 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6\": container with ID starting with 47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6 not 
found: ID does not exist" containerID="47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6" Apr 28 20:23:07.596544 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.596502 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6"} err="failed to get container status \"47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6\": rpc error: code = NotFound desc = could not find container \"47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6\": container with ID starting with 47f5a243e23a2988647fdd20d77c72326724cc2dab32ab82b1d9b120ec3741e6 not found: ID does not exist" Apr 28 20:23:07.596544 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.596525 2565 scope.go:117] "RemoveContainer" containerID="631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1" Apr 28 20:23:07.596739 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:23:07.596724 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1\": container with ID starting with 631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1 not found: ID does not exist" containerID="631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1" Apr 28 20:23:07.596785 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.596741 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1"} err="failed to get container status \"631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1\": rpc error: code = NotFound desc = could not find container \"631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1\": container with ID starting with 631d45316582157b2408ef0a28e83b233871944683684f4e27cc64c6288c37f1 not found: ID does not 
exist" Apr 28 20:23:07.596834 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.596816 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p"] Apr 28 20:23:07.601975 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.601956 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-b5d6db765-k2k9p"] Apr 28 20:23:07.728437 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.728417 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc_03d91aea-3f10-4501-93f1-7581e0a15fa6/storage-initializer/0.log" Apr 28 20:23:07.728783 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.728768 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:23:07.793110 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793084 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmlm4\" (UniqueName: \"kubernetes.io/projected/03d91aea-3f10-4501-93f1-7581e0a15fa6-kube-api-access-cmlm4\") pod \"03d91aea-3f10-4501-93f1-7581e0a15fa6\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " Apr 28 20:23:07.793241 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793137 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-dshm\") pod \"03d91aea-3f10-4501-93f1-7581e0a15fa6\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " Apr 28 20:23:07.793241 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793185 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-tmp-dir\") pod 
\"03d91aea-3f10-4501-93f1-7581e0a15fa6\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " Apr 28 20:23:07.793241 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793216 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-kserve-provision-location\") pod \"03d91aea-3f10-4501-93f1-7581e0a15fa6\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " Apr 28 20:23:07.793363 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793255 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03d91aea-3f10-4501-93f1-7581e0a15fa6-tls-certs\") pod \"03d91aea-3f10-4501-93f1-7581e0a15fa6\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " Apr 28 20:23:07.793363 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793279 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-model-cache\") pod \"03d91aea-3f10-4501-93f1-7581e0a15fa6\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " Apr 28 20:23:07.793363 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793308 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-home\") pod \"03d91aea-3f10-4501-93f1-7581e0a15fa6\" (UID: \"03d91aea-3f10-4501-93f1-7581e0a15fa6\") " Apr 28 20:23:07.793522 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793446 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "03d91aea-3f10-4501-93f1-7581e0a15fa6" (UID: "03d91aea-3f10-4501-93f1-7581e0a15fa6"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.793613 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793567 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-model-cache" (OuterVolumeSpecName: "model-cache") pod "03d91aea-3f10-4501-93f1-7581e0a15fa6" (UID: "03d91aea-3f10-4501-93f1-7581e0a15fa6"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.793666 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793629 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-home" (OuterVolumeSpecName: "home") pod "03d91aea-3f10-4501-93f1-7581e0a15fa6" (UID: "03d91aea-3f10-4501-93f1-7581e0a15fa6"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.793706 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.793671 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.795221 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.795197 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-dshm" (OuterVolumeSpecName: "dshm") pod "03d91aea-3f10-4501-93f1-7581e0a15fa6" (UID: "03d91aea-3f10-4501-93f1-7581e0a15fa6"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.795605 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.795586 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d91aea-3f10-4501-93f1-7581e0a15fa6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "03d91aea-3f10-4501-93f1-7581e0a15fa6" (UID: "03d91aea-3f10-4501-93f1-7581e0a15fa6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:23:07.795674 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.795611 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d91aea-3f10-4501-93f1-7581e0a15fa6-kube-api-access-cmlm4" (OuterVolumeSpecName: "kube-api-access-cmlm4") pod "03d91aea-3f10-4501-93f1-7581e0a15fa6" (UID: "03d91aea-3f10-4501-93f1-7581e0a15fa6"). InnerVolumeSpecName "kube-api-access-cmlm4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:23:07.811317 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.811293 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "03d91aea-3f10-4501-93f1-7581e0a15fa6" (UID: "03d91aea-3f10-4501-93f1-7581e0a15fa6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:23:07.894510 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.894447 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.894510 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.894470 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cmlm4\" (UniqueName: \"kubernetes.io/projected/03d91aea-3f10-4501-93f1-7581e0a15fa6-kube-api-access-cmlm4\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.894510 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.894480 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.894510 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.894489 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.894510 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.894497 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/03d91aea-3f10-4501-93f1-7581e0a15fa6-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:07.894510 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:07.894507 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/03d91aea-3f10-4501-93f1-7581e0a15fa6-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:23:08.111268 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:08.111228 2565 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" path="/var/lib/kubelet/pods/2e040453-eac7-4f57-97d1-7a73dd2144f0/volumes" Apr 28 20:23:08.580614 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:08.580588 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc_03d91aea-3f10-4501-93f1-7581e0a15fa6/storage-initializer/0.log" Apr 28 20:23:08.581088 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:08.581065 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" Apr 28 20:23:08.581216 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:08.581062 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc" event={"ID":"03d91aea-3f10-4501-93f1-7581e0a15fa6","Type":"ContainerDied","Data":"dc61581f24c6fb298de12cb51ac8fbbec476a62af1059dc367a64f311f345ba7"} Apr 28 20:23:08.581216 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:08.581211 2565 scope.go:117] "RemoveContainer" containerID="726388c4b3bcac27989be26311cbaf48ffce94ef7a7c9ab1204fd3c001b4dea4" Apr 28 20:23:08.611406 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:08.611389 2565 scope.go:117] "RemoveContainer" containerID="1561d4545b086e7d5a7cf9b969629a4159c1c0065eee24c3a2d254b476d9206c" Apr 28 20:23:08.617985 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:08.617961 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc"] Apr 28 20:23:08.620917 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:08.620894 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6bb7449db-c9bsc"] Apr 28 20:23:10.117464 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:23:10.117428 2565 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" path="/var/lib/kubelet/pods/03d91aea-3f10-4501-93f1-7581e0a15fa6/volumes" Apr 28 20:24:17.551714 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.551635 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"] Apr 28 20:24:17.552217 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.551918 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" podUID="5b00c990-4a99-4916-8902-69f0f8865190" containerName="manager" containerID="cri-o://20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60" gracePeriod=30 Apr 28 20:24:17.799234 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.799210 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" Apr 28 20:24:17.819273 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.819184 2565 generic.go:358] "Generic (PLEG): container finished" podID="5b00c990-4a99-4916-8902-69f0f8865190" containerID="20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60" exitCode=0 Apr 28 20:24:17.819273 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.819269 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" Apr 28 20:24:17.819474 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.819287 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" event={"ID":"5b00c990-4a99-4916-8902-69f0f8865190","Type":"ContainerDied","Data":"20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60"} Apr 28 20:24:17.819474 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.819316 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-65594cb6f6-dfc8f" event={"ID":"5b00c990-4a99-4916-8902-69f0f8865190","Type":"ContainerDied","Data":"0dfd673ac36423bede85b21d552ef3c97ce57b8e5d697afc7bd163d976e421e8"} Apr 28 20:24:17.819474 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.819335 2565 scope.go:117] "RemoveContainer" containerID="20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60" Apr 28 20:24:17.827889 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.827863 2565 scope.go:117] "RemoveContainer" containerID="20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60" Apr 28 20:24:17.828197 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:24:17.828152 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60\": container with ID starting with 20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60 not found: ID does not exist" containerID="20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60" Apr 28 20:24:17.828303 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.828211 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60"} err="failed to get container status \"20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60\": 
rpc error: code = NotFound desc = could not find container \"20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60\": container with ID starting with 20592bed7893768a6c0eed6320110221dd9631efc78d57d5032dc95f0a5a4b60 not found: ID does not exist" Apr 28 20:24:17.928993 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.928964 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpvpb\" (UniqueName: \"kubernetes.io/projected/5b00c990-4a99-4916-8902-69f0f8865190-kube-api-access-tpvpb\") pod \"5b00c990-4a99-4916-8902-69f0f8865190\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " Apr 28 20:24:17.929199 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.929020 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert\") pod \"5b00c990-4a99-4916-8902-69f0f8865190\" (UID: \"5b00c990-4a99-4916-8902-69f0f8865190\") " Apr 28 20:24:17.930983 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.930952 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b00c990-4a99-4916-8902-69f0f8865190-kube-api-access-tpvpb" (OuterVolumeSpecName: "kube-api-access-tpvpb") pod "5b00c990-4a99-4916-8902-69f0f8865190" (UID: "5b00c990-4a99-4916-8902-69f0f8865190"). InnerVolumeSpecName "kube-api-access-tpvpb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:24:17.930983 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:17.930970 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert" (OuterVolumeSpecName: "cert") pod "5b00c990-4a99-4916-8902-69f0f8865190" (UID: "5b00c990-4a99-4916-8902-69f0f8865190"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 20:24:18.030019 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:18.029984 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tpvpb\" (UniqueName: \"kubernetes.io/projected/5b00c990-4a99-4916-8902-69f0f8865190-kube-api-access-tpvpb\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:24:18.030019 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:18.030014 2565 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b00c990-4a99-4916-8902-69f0f8865190-cert\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:24:18.136852 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:18.136818 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"] Apr 28 20:24:18.140938 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:18.140913 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-65594cb6f6-dfc8f"] Apr 28 20:24:20.111065 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:20.111031 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b00c990-4a99-4916-8902-69f0f8865190" path="/var/lib/kubelet/pods/5b00c990-4a99-4916-8902-69f0f8865190/volumes" Apr 28 20:24:38.891904 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:38.891867 2565 generic.go:358] "Generic (PLEG): container finished" podID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerID="9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792" exitCode=0 Apr 28 20:24:38.892363 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:38.891939 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" event={"ID":"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7","Type":"ContainerDied","Data":"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792"} Apr 28 20:24:39.898024 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:24:39.897984 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" event={"ID":"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7","Type":"ContainerStarted","Data":"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0"} Apr 28 20:24:39.921544 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:39.921489 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podStartSLOduration=112.921470512 podStartE2EDuration="1m52.921470512s" podCreationTimestamp="2026-04-28 20:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:24:39.918576242 +0000 UTC m=+4064.424292945" watchObservedRunningTime="2026-04-28 20:24:39.921470512 +0000 UTC m=+4064.427187182" Apr 28 20:24:46.930901 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:46.930855 2565 generic.go:358] "Generic (PLEG): container finished" podID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerID="cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988" exitCode=0 Apr 28 20:24:46.930901 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:46.930895 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" event={"ID":"01fdfbc0-ffda-4f56-99aa-7a0fab093146","Type":"ContainerDied","Data":"cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988"} Apr 28 20:24:47.936659 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:47.936624 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" event={"ID":"01fdfbc0-ffda-4f56-99aa-7a0fab093146","Type":"ContainerStarted","Data":"cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d"} Apr 28 20:24:47.962501 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:24:47.962440 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podStartSLOduration=120.962420137 podStartE2EDuration="2m0.962420137s" podCreationTimestamp="2026-04-28 20:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:24:47.959020863 +0000 UTC m=+4072.464737522" watchObservedRunningTime="2026-04-28 20:24:47.962420137 +0000 UTC m=+4072.468136804" Apr 28 20:24:48.013477 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:48.013442 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:24:48.013477 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:48.013477 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:24:48.013807 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:48.013756 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:24:48.031224 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:48.031188 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:24:48.031366 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:48.031240 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:24:48.032609 ip-10-0-141-41 kubenswrapper[2565]: 
I0428 20:24:48.032582 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:24:54.373070 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373033 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 28 20:24:54.373670 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373646 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="storage-initializer" Apr 28 20:24:54.373723 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373676 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="storage-initializer" Apr 28 20:24:54.373723 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373710 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="storage-initializer" Apr 28 20:24:54.373723 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373719 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="storage-initializer" Apr 28 20:24:54.373832 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373733 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b00c990-4a99-4916-8902-69f0f8865190" containerName="manager" Apr 28 20:24:54.373832 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373741 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b00c990-4a99-4916-8902-69f0f8865190" containerName="manager" Apr 28 20:24:54.373832 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373757 2565 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" Apr 28 20:24:54.373832 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373766 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" Apr 28 20:24:54.373832 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373783 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" Apr 28 20:24:54.373832 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373792 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" Apr 28 20:24:54.374019 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373872 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="llm-d-routing-sidecar" Apr 28 20:24:54.374019 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373885 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e040453-eac7-4f57-97d1-7a73dd2144f0" containerName="main" Apr 28 20:24:54.374019 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373899 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b00c990-4a99-4916-8902-69f0f8865190" containerName="manager" Apr 28 20:24:54.374019 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.373912 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="03d91aea-3f10-4501-93f1-7581e0a15fa6" containerName="storage-initializer" Apr 28 20:24:54.378407 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.378384 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.381292 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.381265 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-bvrgh\"" Apr 28 20:24:54.381415 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.381286 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 28 20:24:54.386473 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.386452 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 28 20:24:54.460208 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.460155 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.460400 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.460241 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trzrw\" (UniqueName: \"kubernetes.io/projected/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kube-api-access-trzrw\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.460400 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.460265 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.460400 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.460334 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.460400 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.460360 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.460400 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.460386 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.460400 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.460401 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.560998 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.560949 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.560998 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.560997 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561328 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561013 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561328 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561034 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561328 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561095 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561328 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561148 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trzrw\" (UniqueName: \"kubernetes.io/projected/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kube-api-access-trzrw\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561328 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561617 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561425 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561617 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561457 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561734 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561630 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.561799 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.561759 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.563499 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.563466 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.564021 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.563986 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.568745 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.568716 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trzrw\" (UniqueName: \"kubernetes.io/projected/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kube-api-access-trzrw\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.692830 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.692742 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:24:54.830014 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.829977 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 28 20:24:54.833706 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:24:54.833674 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6bd14c6_3e0d_482e_806f_48528afa6c3b.slice/crio-f2ee0f79c77a24b764580cb47600893fbb5b6daab9000dfb0908a0e3ad5e825c WatchSource:0}: Error finding container f2ee0f79c77a24b764580cb47600893fbb5b6daab9000dfb0908a0e3ad5e825c: Status 404 returned error can't find the container with id f2ee0f79c77a24b764580cb47600893fbb5b6daab9000dfb0908a0e3ad5e825c Apr 28 20:24:54.965056 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.964949 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"c6bd14c6-3e0d-482e-806f-48528afa6c3b","Type":"ContainerStarted","Data":"8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3"} Apr 28 20:24:54.965056 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:54.964999 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c6bd14c6-3e0d-482e-806f-48528afa6c3b","Type":"ContainerStarted","Data":"f2ee0f79c77a24b764580cb47600893fbb5b6daab9000dfb0908a0e3ad5e825c"} Apr 28 20:24:58.013755 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:58.013706 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:24:58.030899 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:24:58.030846 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:25:08.013214 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:08.013137 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:25:08.031123 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:08.031079 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" 
podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:25:18.014015 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:18.013966 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:25:18.031093 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:18.031049 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:25:28.013268 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:28.013202 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:25:28.031677 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:28.031638 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:25:38.013285 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:38.013237 2565 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:25:38.031503 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:38.031464 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:25:48.013127 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:48.013082 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:25:48.030978 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:48.030945 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:25:58.013734 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:58.013680 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:25:58.031130 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:25:58.031094 2565 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:26:08.013392 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:08.013333 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:26:08.031476 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:08.031437 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:26:18.013626 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:18.013578 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:26:18.031669 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:18.031629 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:26:28.013370 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:26:28.013320 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8001/health\": dial tcp 10.134.0.45:8001: connect: connection refused" Apr 28 20:26:28.031247 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:28.031214 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:26:38.023885 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:38.023850 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:26:38.030751 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:38.030719 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:26:38.036663 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:38.036642 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:26:48.031208 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:48.031147 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 28 20:26:58.041819 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:58.041789 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:26:58.049631 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:26:58.049605 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:27:09.480644 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:09.480613 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp"] Apr 28 20:27:09.481243 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:09.480896 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" containerID="cri-o://cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d" gracePeriod=30 Apr 28 20:27:09.486155 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:09.486127 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm"] Apr 28 20:27:09.486518 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:09.486479 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" containerID="cri-o://618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0" gracePeriod=30 Apr 28 20:27:39.487102 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.487041 2565 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="llm-d-routing-sidecar" containerID="cri-o://e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d" gracePeriod=2 Apr 28 20:27:39.773643 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.773623 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b796954f4-44vdm_987dfbe2-20b7-4b7d-956e-cd6b8f618ba7/main/0.log" Apr 28 20:27:39.774295 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.774278 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" Apr 28 20:27:39.777286 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.777272 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" Apr 28 20:27:39.836282 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836259 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-dshm\") pod \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " Apr 28 20:27:39.836451 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836297 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-home\") pod \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " Apr 28 20:27:39.836451 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836331 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-home\") pod 
\"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " Apr 28 20:27:39.836451 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836365 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj97q\" (UniqueName: \"kubernetes.io/projected/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kube-api-access-kj97q\") pod \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " Apr 28 20:27:39.836451 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836397 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8jzw\" (UniqueName: \"kubernetes.io/projected/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kube-api-access-d8jzw\") pod \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " Apr 28 20:27:39.836451 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836423 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kserve-provision-location\") pod \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " Apr 28 20:27:39.836749 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836448 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tls-certs\") pod \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") " Apr 28 20:27:39.836749 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836500 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-model-cache\") pod \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") " Apr 28 
20:27:39.837381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836879 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-dshm\") pod \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") "
Apr 28 20:27:39.837381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836912 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tmp-dir\") pod \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") "
Apr 28 20:27:39.837381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836949 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tmp-dir\") pod \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") "
Apr 28 20:27:39.837381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836983 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kserve-provision-location\") pod \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") "
Apr 28 20:27:39.837381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.836989 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-home" (OuterVolumeSpecName: "home") pod "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" (UID: "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.837381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.837025 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tls-certs\") pod \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\" (UID: \"01fdfbc0-ffda-4f56-99aa-7a0fab093146\") "
Apr 28 20:27:39.837381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.837140 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-model-cache\") pod \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\" (UID: \"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7\") "
Apr 28 20:27:39.837381 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.837257 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-home" (OuterVolumeSpecName: "home") pod "01fdfbc0-ffda-4f56-99aa-7a0fab093146" (UID: "01fdfbc0-ffda-4f56-99aa-7a0fab093146"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.837942 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.837584 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.837942 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.837604 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.837942 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.837839 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-model-cache" (OuterVolumeSpecName: "model-cache") pod "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" (UID: "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.838244 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.838144 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-model-cache" (OuterVolumeSpecName: "model-cache") pod "01fdfbc0-ffda-4f56-99aa-7a0fab093146" (UID: "01fdfbc0-ffda-4f56-99aa-7a0fab093146"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.839067 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.839025 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kube-api-access-d8jzw" (OuterVolumeSpecName: "kube-api-access-d8jzw") pod "01fdfbc0-ffda-4f56-99aa-7a0fab093146" (UID: "01fdfbc0-ffda-4f56-99aa-7a0fab093146"). InnerVolumeSpecName "kube-api-access-d8jzw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:27:39.840570 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.840524 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" (UID: "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:27:39.840570 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.840552 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kube-api-access-kj97q" (OuterVolumeSpecName: "kube-api-access-kj97q") pod "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" (UID: "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7"). InnerVolumeSpecName "kube-api-access-kj97q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:27:39.840728 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.840556 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-dshm" (OuterVolumeSpecName: "dshm") pod "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" (UID: "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.840938 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.840898 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-dshm" (OuterVolumeSpecName: "dshm") pod "01fdfbc0-ffda-4f56-99aa-7a0fab093146" (UID: "01fdfbc0-ffda-4f56-99aa-7a0fab093146"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.841460 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.841427 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "01fdfbc0-ffda-4f56-99aa-7a0fab093146" (UID: "01fdfbc0-ffda-4f56-99aa-7a0fab093146"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:27:39.849535 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.849512 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "01fdfbc0-ffda-4f56-99aa-7a0fab093146" (UID: "01fdfbc0-ffda-4f56-99aa-7a0fab093146"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.851693 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.851671 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" (UID: "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.900682 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.900643 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "01fdfbc0-ffda-4f56-99aa-7a0fab093146" (UID: "01fdfbc0-ffda-4f56-99aa-7a0fab093146"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.902988 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.902957 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" (UID: "987dfbe2-20b7-4b7d-956e-cd6b8f618ba7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:27:39.938977 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.938950 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.938977 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.938979 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.938992 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939006 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kj97q\" (UniqueName: \"kubernetes.io/projected/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kube-api-access-kj97q\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939022 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d8jzw\" (UniqueName: \"kubernetes.io/projected/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kube-api-access-d8jzw\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939039 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939052 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939065 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939078 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939089 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939101 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7-tmp-dir\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:39.939188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:39.939112 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01fdfbc0-ffda-4f56-99aa-7a0fab093146-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:27:40.580337 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.580301 2565 generic.go:358] "Generic (PLEG): container finished" podID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerID="cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d" exitCode=137
Apr 28 20:27:40.580801 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.580385 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp"
Apr 28 20:27:40.580801 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.580389 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" event={"ID":"01fdfbc0-ffda-4f56-99aa-7a0fab093146","Type":"ContainerDied","Data":"cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d"}
Apr 28 20:27:40.580801 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.580501 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp" event={"ID":"01fdfbc0-ffda-4f56-99aa-7a0fab093146","Type":"ContainerDied","Data":"8d0e1c06a561ebc462eaa14aa9e95becac7ca830ace76177ae960bdcea68268f"}
Apr 28 20:27:40.580801 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.580526 2565 scope.go:117] "RemoveContainer" containerID="cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d"
Apr 28 20:27:40.581827 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.581812 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b796954f4-44vdm_987dfbe2-20b7-4b7d-956e-cd6b8f618ba7/main/0.log"
Apr 28 20:27:40.582507 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.582485 2565 generic.go:358] "Generic (PLEG): container finished" podID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerID="618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0" exitCode=137
Apr 28 20:27:40.582625 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.582509 2565 generic.go:358] "Generic (PLEG): container finished" podID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerID="e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d" exitCode=0
Apr 28 20:27:40.582625 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.582564 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" event={"ID":"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7","Type":"ContainerDied","Data":"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0"}
Apr 28 20:27:40.582625 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.582581 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm"
Apr 28 20:27:40.582933 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.582589 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" event={"ID":"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7","Type":"ContainerDied","Data":"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d"}
Apr 28 20:27:40.583047 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.582973 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm" event={"ID":"987dfbe2-20b7-4b7d-956e-cd6b8f618ba7","Type":"ContainerDied","Data":"f8b7cc9ca11e0f75cf9b85298f86332d04c988a6f0937f7af0876220c56ce095"}
Apr 28 20:27:40.590928 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.590893 2565 scope.go:117] "RemoveContainer" containerID="cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988"
Apr 28 20:27:40.601436 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.601412 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm"]
Apr 28 20:27:40.604506 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.604478 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b796954f4-44vdm"]
Apr 28 20:27:40.615156 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.615133 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp"]
Apr 28 20:27:40.620093 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.620073 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-8468cf7cf7-st9hp"]
Apr 28 20:27:40.654378 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.654359 2565 scope.go:117] "RemoveContainer" containerID="cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d"
Apr 28 20:27:40.654668 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:27:40.654650 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d\": container with ID starting with cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d not found: ID does not exist" containerID="cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d"
Apr 28 20:27:40.654746 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.654683 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d"} err="failed to get container status \"cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d\": rpc error: code = NotFound desc = could not find container \"cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d\": container with ID starting with cdc070073ba237b8f94a075ef311014c46fa69f4999bed2c5a8fa6e73128985d not found: ID does not exist"
Apr 28 20:27:40.654746 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.654709 2565 scope.go:117] "RemoveContainer" containerID="cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988"
Apr 28 20:27:40.654940 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:27:40.654924 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988\": container with ID starting with cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988 not found: ID does not exist" containerID="cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988"
Apr 28 20:27:40.655001 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.654947 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988"} err="failed to get container status \"cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988\": rpc error: code = NotFound desc = could not find container \"cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988\": container with ID starting with cfeeaf4bc875b56a2153b2ad8e8dbed68f1312fa23078af817d2550c1f772988 not found: ID does not exist"
Apr 28 20:27:40.655001 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.654968 2565 scope.go:117] "RemoveContainer" containerID="618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0"
Apr 28 20:27:40.662631 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.662616 2565 scope.go:117] "RemoveContainer" containerID="9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792"
Apr 28 20:27:40.728865 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.728846 2565 scope.go:117] "RemoveContainer" containerID="e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d"
Apr 28 20:27:40.735886 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.735867 2565 scope.go:117] "RemoveContainer" containerID="618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0"
Apr 28 20:27:40.736128 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:27:40.736105 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0\": container with ID starting with 618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0 not found: ID does not exist" containerID="618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0"
Apr 28 20:27:40.736194 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.736135 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0"} err="failed to get container status \"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0\": rpc error: code = NotFound desc = could not find container \"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0\": container with ID starting with 618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0 not found: ID does not exist"
Apr 28 20:27:40.736194 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.736156 2565 scope.go:117] "RemoveContainer" containerID="9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792"
Apr 28 20:27:40.736406 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:27:40.736392 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792\": container with ID starting with 9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792 not found: ID does not exist" containerID="9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792"
Apr 28 20:27:40.736449 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.736408 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792"} err="failed to get container status \"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792\": rpc error: code = NotFound desc = could not find container \"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792\": container with ID starting with 9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792 not found: ID does not exist"
Apr 28 20:27:40.736449 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.736420 2565 scope.go:117] "RemoveContainer" containerID="e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d"
Apr 28 20:27:40.736627 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:27:40.736610 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d\": container with ID starting with e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d not found: ID does not exist" containerID="e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d"
Apr 28 20:27:40.736671 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.736630 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d"} err="failed to get container status \"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d\": rpc error: code = NotFound desc = could not find container \"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d\": container with ID starting with e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d not found: ID does not exist"
Apr 28 20:27:40.736671 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.736642 2565 scope.go:117] "RemoveContainer" containerID="618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0"
Apr 28 20:27:40.736887 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.736866 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0"} err="failed to get container status \"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0\": rpc error: code = NotFound desc = could not find container \"618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0\": container with ID starting with 618e443e769bd335648ac9ee6dd560ef6841fdbc24de13e7644d2b26a6e37cd0 not found: ID does not exist"
Apr 28 20:27:40.736887 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.736887 2565 scope.go:117] "RemoveContainer" containerID="9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792"
Apr 28 20:27:40.737066 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.737048 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792"} err="failed to get container status \"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792\": rpc error: code = NotFound desc = could not find container \"9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792\": container with ID starting with 9ba5db7fbf0bf0c7a8c2b5db484d678a2b088e1421964badaa12dd62a0c10792 not found: ID does not exist"
Apr 28 20:27:40.737066 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.737065 2565 scope.go:117] "RemoveContainer" containerID="e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d"
Apr 28 20:27:40.737359 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:40.737339 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d"} err="failed to get container status \"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d\": rpc error: code = NotFound desc = could not find container \"e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d\": container with ID starting with e97134452470938c3d885211b958cfa69bfee84d0fdb5efcd65988c3d710e11d not found: ID does not exist"
Apr 28 20:27:42.111047 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:42.111008 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" path="/var/lib/kubelet/pods/01fdfbc0-ffda-4f56-99aa-7a0fab093146/volumes"
Apr 28 20:27:42.111495 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:27:42.111480 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" path="/var/lib/kubelet/pods/987dfbe2-20b7-4b7d-956e-cd6b8f618ba7/volumes"
Apr 28 20:35:52.234124 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:35:52.234091 2565 generic.go:358] "Generic (PLEG): container finished" podID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerID="8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3" exitCode=0
Apr 28 20:35:52.234479 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:35:52.234173 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c6bd14c6-3e0d-482e-806f-48528afa6c3b","Type":"ContainerDied","Data":"8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3"}
Apr 28 20:35:52.238048 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:35:52.238029 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 20:35:53.239459 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:35:53.239421 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c6bd14c6-3e0d-482e-806f-48528afa6c3b","Type":"ContainerStarted","Data":"a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae"}
Apr 28 20:35:53.263389 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:35:53.263337 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=659.263317311 podStartE2EDuration="10m59.263317311s" podCreationTimestamp="2026-04-28 20:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:35:53.261618289 +0000 UTC m=+4737.767334975" watchObservedRunningTime="2026-04-28 20:35:53.263317311 +0000 UTC m=+4737.769033976"
Apr 28 20:35:54.693494 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:35:54.693459 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 28 20:35:54.693891 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:35:54.693504 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 28 20:35:54.695226 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:35:54.695200 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:36:04.693927 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:36:04.693871 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:36:14.693915 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:36:14.693823 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:36:24.694059 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:36:24.694017 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:36:34.693670 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:36:34.693622 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:36:44.694198 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:36:44.694130 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:36:54.693519 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:36:54.693472 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:37:04.693327 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:04.693285 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:37:14.693541 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:14.693488 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused"
Apr 28 20:37:24.702730 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:24.702695 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 28 20:37:24.710521 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:24.710495 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 28 20:37:32.470806 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:32.470773 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 28 20:37:32.471212 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:32.471064 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" containerID="cri-o://a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae" gracePeriod=30
Apr 28 20:37:33.217290 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.217264 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 28 20:37:33.310678 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.310638 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kserve-provision-location\") pod \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") "
Apr 28 20:37:33.310678 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.310683 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-dshm\") pod \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") "
Apr 28 20:37:33.310933 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.310746 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-model-cache\") pod \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") "
Apr 28 20:37:33.310933 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.310779 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trzrw\" (UniqueName: \"kubernetes.io/projected/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kube-api-access-trzrw\") pod \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") "
Apr 28 20:37:33.310933 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.310824 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-home\") pod \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") "
Apr 28 20:37:33.310933 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.310876 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tmp-dir\") pod \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") "
Apr 28 20:37:33.310933 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.310921 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tls-certs\") pod \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\" (UID: \"c6bd14c6-3e0d-482e-806f-48528afa6c3b\") "
Apr 28 20:37:33.311193 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.310993 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-model-cache" (OuterVolumeSpecName: "model-cache") pod "c6bd14c6-3e0d-482e-806f-48528afa6c3b" (UID: "c6bd14c6-3e0d-482e-806f-48528afa6c3b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:37:33.311253 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.311222 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-model-cache\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\""
Apr 28 20:37:33.311647 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.311613 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-home" (OuterVolumeSpecName: "home") pod "c6bd14c6-3e0d-482e-806f-48528afa6c3b" (UID: "c6bd14c6-3e0d-482e-806f-48528afa6c3b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:37:33.312987 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.312956 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-dshm" (OuterVolumeSpecName: "dshm") pod "c6bd14c6-3e0d-482e-806f-48528afa6c3b" (UID: "c6bd14c6-3e0d-482e-806f-48528afa6c3b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 20:37:33.313122 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.313043 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c6bd14c6-3e0d-482e-806f-48528afa6c3b" (UID: "c6bd14c6-3e0d-482e-806f-48528afa6c3b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:37:33.313287 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.313251 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kube-api-access-trzrw" (OuterVolumeSpecName: "kube-api-access-trzrw") pod "c6bd14c6-3e0d-482e-806f-48528afa6c3b" (UID: "c6bd14c6-3e0d-482e-806f-48528afa6c3b"). InnerVolumeSpecName "kube-api-access-trzrw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:37:33.323599 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.323573 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "c6bd14c6-3e0d-482e-806f-48528afa6c3b" (UID: "c6bd14c6-3e0d-482e-806f-48528afa6c3b"). InnerVolumeSpecName "tmp-dir".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:37:33.367658 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.367629 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d548q/must-gather-cv6x2"] Apr 28 20:37:33.368002 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.367991 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="storage-initializer" Apr 28 20:37:33.368050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368005 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="storage-initializer" Apr 28 20:37:33.368050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368013 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" Apr 28 20:37:33.368050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368019 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" Apr 28 20:37:33.368050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368029 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" Apr 28 20:37:33.368050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368035 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" Apr 28 20:37:33.368050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368046 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" Apr 28 20:37:33.368050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368052 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" Apr 28 20:37:33.368355 
ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368062 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="storage-initializer" Apr 28 20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368068 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="storage-initializer" Apr 28 20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368080 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="storage-initializer" Apr 28 20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368085 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="storage-initializer" Apr 28 20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368092 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="llm-d-routing-sidecar" Apr 28 20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368097 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="llm-d-routing-sidecar" Apr 28 20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368148 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="main" Apr 28 20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368155 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="987dfbe2-20b7-4b7d-956e-cd6b8f618ba7" containerName="llm-d-routing-sidecar" Apr 28 20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368189 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerName="main" Apr 28 
20:37:33.368355 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.368198 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="01fdfbc0-ffda-4f56-99aa-7a0fab093146" containerName="main" Apr 28 20:37:33.372312 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.372294 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:37:33.374918 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.374899 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d548q\"/\"kube-root-ca.crt\"" Apr 28 20:37:33.375027 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.374901 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d548q\"/\"openshift-service-ca.crt\"" Apr 28 20:37:33.375027 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.374993 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d548q\"/\"default-dockercfg-kkqgh\"" Apr 28 20:37:33.378030 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.378005 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c6bd14c6-3e0d-482e-806f-48528afa6c3b" (UID: "c6bd14c6-3e0d-482e-806f-48528afa6c3b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:37:33.378795 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.378773 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d548q/must-gather-cv6x2"] Apr 28 20:37:33.411631 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.411606 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-must-gather-output\") pod \"must-gather-cv6x2\" (UID: \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\") " pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:37:33.411750 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.411635 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkpdg\" (UniqueName: \"kubernetes.io/projected/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-kube-api-access-bkpdg\") pod \"must-gather-cv6x2\" (UID: \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\") " pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:37:33.411750 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.411689 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trzrw\" (UniqueName: \"kubernetes.io/projected/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kube-api-access-trzrw\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:37:33.411750 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.411703 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-home\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:37:33.411750 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.411717 2565 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tmp-dir\") on node 
\"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:37:33.411750 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.411729 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bd14c6-3e0d-482e-806f-48528afa6c3b-tls-certs\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:37:33.411750 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.411739 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-kserve-provision-location\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:37:33.411750 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.411747 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c6bd14c6-3e0d-482e-806f-48528afa6c3b-dshm\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:37:33.512646 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.512609 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-must-gather-output\") pod \"must-gather-cv6x2\" (UID: \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\") " pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:37:33.512646 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.512645 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkpdg\" (UniqueName: \"kubernetes.io/projected/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-kube-api-access-bkpdg\") pod \"must-gather-cv6x2\" (UID: \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\") " pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:37:33.513042 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.513002 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-must-gather-output\") pod \"must-gather-cv6x2\" (UID: \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\") " pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:37:33.520948 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.520923 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkpdg\" (UniqueName: \"kubernetes.io/projected/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-kube-api-access-bkpdg\") pod \"must-gather-cv6x2\" (UID: \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\") " pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:37:33.581857 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.581778 2565 generic.go:358] "Generic (PLEG): container finished" podID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" containerID="a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae" exitCode=0 Apr 28 20:37:33.581857 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.581856 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 28 20:37:33.582046 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.581859 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c6bd14c6-3e0d-482e-806f-48528afa6c3b","Type":"ContainerDied","Data":"a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae"} Apr 28 20:37:33.582046 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.581908 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c6bd14c6-3e0d-482e-806f-48528afa6c3b","Type":"ContainerDied","Data":"f2ee0f79c77a24b764580cb47600893fbb5b6daab9000dfb0908a0e3ad5e825c"} Apr 28 20:37:33.582046 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.581932 2565 scope.go:117] "RemoveContainer" containerID="a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae" Apr 28 20:37:33.591915 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.591900 2565 scope.go:117] "RemoveContainer" containerID="8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3" Apr 28 20:37:33.601906 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.601888 2565 scope.go:117] "RemoveContainer" containerID="a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae" Apr 28 20:37:33.602133 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:37:33.602115 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae\": container with ID starting with a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae not found: ID does not exist" containerID="a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae" Apr 28 20:37:33.602237 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.602141 2565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae"} err="failed to get container status \"a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae\": rpc error: code = NotFound desc = could not find container \"a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae\": container with ID starting with a3319c840715f955bb94ac1be0ab0087c50263234c522769a69ff7297f8043ae not found: ID does not exist" Apr 28 20:37:33.602237 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.602174 2565 scope.go:117] "RemoveContainer" containerID="8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3" Apr 28 20:37:33.602464 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:37:33.602443 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3\": container with ID starting with 8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3 not found: ID does not exist" containerID="8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3" Apr 28 20:37:33.602517 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.602470 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3"} err="failed to get container status \"8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3\": rpc error: code = NotFound desc = could not find container \"8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3\": container with ID starting with 8f613de411272038e3caf60f56eb4d2c1b6a11e03b85f28d051a33b9b846faa3 not found: ID does not exist" Apr 28 20:37:33.605484 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.605462 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 28 20:37:33.608971 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.608949 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 28 20:37:33.682433 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.682401 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:37:33.801433 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:33.801408 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d548q/must-gather-cv6x2"] Apr 28 20:37:33.803795 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:37:33.803767 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d9dbb8e_5396_4fce_a3e8_4c5a97c24ce4.slice/crio-fdd47a2dcf726d0f60bbd7551e918c73928257fa3b9e114f3e49e7f397ce37fb WatchSource:0}: Error finding container fdd47a2dcf726d0f60bbd7551e918c73928257fa3b9e114f3e49e7f397ce37fb: Status 404 returned error can't find the container with id fdd47a2dcf726d0f60bbd7551e918c73928257fa3b9e114f3e49e7f397ce37fb Apr 28 20:37:34.112217 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:34.112180 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bd14c6-3e0d-482e-806f-48528afa6c3b" path="/var/lib/kubelet/pods/c6bd14c6-3e0d-482e-806f-48528afa6c3b/volumes" Apr 28 20:37:34.587770 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:34.587739 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d548q/must-gather-cv6x2" event={"ID":"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4","Type":"ContainerStarted","Data":"fdd47a2dcf726d0f60bbd7551e918c73928257fa3b9e114f3e49e7f397ce37fb"} Apr 28 20:37:39.608653 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:39.608614 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-d548q/must-gather-cv6x2" event={"ID":"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4","Type":"ContainerStarted","Data":"ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8"} Apr 28 20:37:39.609049 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:39.608660 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d548q/must-gather-cv6x2" event={"ID":"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4","Type":"ContainerStarted","Data":"fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784"} Apr 28 20:37:39.626832 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:39.626783 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d548q/must-gather-cv6x2" podStartSLOduration=1.203046781 podStartE2EDuration="6.626767104s" podCreationTimestamp="2026-04-28 20:37:33 +0000 UTC" firstStartedPulling="2026-04-28 20:37:33.805401405 +0000 UTC m=+4838.311118049" lastFinishedPulling="2026-04-28 20:37:39.229121729 +0000 UTC m=+4843.734838372" observedRunningTime="2026-04-28 20:37:39.626349714 +0000 UTC m=+4844.132066383" watchObservedRunningTime="2026-04-28 20:37:39.626767104 +0000 UTC m=+4844.132483770" Apr 28 20:37:48.314185 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:48.314076 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:48.391000 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:48.390967 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:49.378825 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:49.378794 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:49.416119 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:49.416089 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:50.369991 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:50.369957 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:50.404423 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:50.404388 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:51.361410 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:51.361381 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:51.394511 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:51.394486 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:52.334306 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:52.334276 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:52.366123 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:52.366088 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:53.333109 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:53.333072 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:53.366857 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:53.366829 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:54.308763 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:54.308729 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:54.341528 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:54.341494 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:55.290566 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:55.290533 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:55.325979 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:55.325955 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:56.269848 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:56.269817 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:56.306959 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:56.306932 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:57.266065 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:57.266019 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:57.301005 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:57.300948 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:58.243763 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:58.243729 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:58.275779 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:58.275740 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:37:59.252247 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:59.252193 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:37:59.289617 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:37:59.289587 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:38:00.313797 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:00.313766 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:38:00.352631 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:00.352583 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:38:01.335466 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:01.335427 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-s56s9_cf96fe9c-423f-4e83-9e21-3a1128dc1f55/istio-proxy/0.log" Apr 28 20:38:01.389427 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:01.389387 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76b8b9b84-z2vkn_16877967-b3a4-4cab-bf40-3e55beed1d8c/storage-initializer/0.log" Apr 28 20:38:02.415322 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:02.415291 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d64467788-4mcrw_977bb0b7-3623-4100-ba3a-1b9d24046162/router/0.log" Apr 28 20:38:03.211044 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:03.211014 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d64467788-4mcrw_977bb0b7-3623-4100-ba3a-1b9d24046162/router/0.log" Apr 28 20:38:04.012469 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:04.012438 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d64467788-4mcrw_977bb0b7-3623-4100-ba3a-1b9d24046162/router/0.log" Apr 28 20:38:04.756278 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:38:04.756245 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-fz2s4_a607d0e0-237b-4251-b606-3b4e9e2db6c6/manager/0.log" Apr 28 20:38:04.818952 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:04.818913 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rtl6k_442d9435-e707-4a83-8ae3-039a186a940b/manager/0.log" Apr 28 20:38:05.617126 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:05.617089 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-fz2s4_a607d0e0-237b-4251-b606-3b4e9e2db6c6/manager/0.log" Apr 28 20:38:05.687392 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:05.687361 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rtl6k_442d9435-e707-4a83-8ae3-039a186a940b/manager/0.log" Apr 28 20:38:06.489784 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:06.489749 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-fz2s4_a607d0e0-237b-4251-b606-3b4e9e2db6c6/manager/0.log" Apr 28 20:38:06.558731 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:06.558701 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rtl6k_442d9435-e707-4a83-8ae3-039a186a940b/manager/0.log" Apr 28 20:38:07.380129 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:07.380092 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-fz2s4_a607d0e0-237b-4251-b606-3b4e9e2db6c6/manager/0.log" Apr 28 20:38:07.450448 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:07.450419 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rtl6k_442d9435-e707-4a83-8ae3-039a186a940b/manager/0.log" Apr 28 20:38:08.240667 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:08.240639 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-fz2s4_a607d0e0-237b-4251-b606-3b4e9e2db6c6/manager/0.log" Apr 28 20:38:08.309870 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:08.309837 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rtl6k_442d9435-e707-4a83-8ae3-039a186a940b/manager/0.log" Apr 28 20:38:09.718337 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:09.718306 2565 generic.go:358] "Generic (PLEG): container finished" podID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerID="fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784" exitCode=0 Apr 28 20:38:09.718758 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:09.718381 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d548q/must-gather-cv6x2" event={"ID":"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4","Type":"ContainerDied","Data":"fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784"} Apr 28 20:38:09.718758 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:09.718725 2565 scope.go:117] "RemoveContainer" containerID="fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784" Apr 28 20:38:09.917548 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:09.917519 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d548q_must-gather-cv6x2_9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4/gather/0.log" Apr 28 20:38:13.310132 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:13.310104 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qx6zq_2174f747-901a-4309-a1e8-74f7920485ec/global-pull-secret-syncer/0.log" Apr 28 
20:38:13.500648 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:13.500609 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-x9j9g_2df21b98-cdf7-4d04-a8e9-36920fde23bd/konnectivity-agent/0.log" Apr 28 20:38:13.558567 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:13.558541 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-41.ec2.internal_ad9143c56694e3dad0c99502369b2b4a/haproxy/0.log" Apr 28 20:38:15.422252 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.422213 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d548q/must-gather-cv6x2"] Apr 28 20:38:15.422756 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.422472 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-d548q/must-gather-cv6x2" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerName="copy" containerID="cri-o://ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8" gracePeriod=2 Apr 28 20:38:15.424945 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.424911 2565 status_manager.go:895] "Failed to get status for pod" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" pod="openshift-must-gather-d548q/must-gather-cv6x2" err="pods \"must-gather-cv6x2\" is forbidden: User \"system:node:ip-10-0-141-41.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-d548q\": no relationship found between node 'ip-10-0-141-41.ec2.internal' and this object" Apr 28 20:38:15.426904 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.426431 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d548q/must-gather-cv6x2"] Apr 28 20:38:15.653386 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.653366 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d548q_must-gather-cv6x2_9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4/copy/0.log" 
Apr 28 20:38:15.653743 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.653726 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:38:15.655892 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.655866 2565 status_manager.go:895] "Failed to get status for pod" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" pod="openshift-must-gather-d548q/must-gather-cv6x2" err="pods \"must-gather-cv6x2\" is forbidden: User \"system:node:ip-10-0-141-41.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-d548q\": no relationship found between node 'ip-10-0-141-41.ec2.internal' and this object" Apr 28 20:38:15.713839 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.713778 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-must-gather-output\") pod \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\" (UID: \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\") " Apr 28 20:38:15.713839 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.713827 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkpdg\" (UniqueName: \"kubernetes.io/projected/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-kube-api-access-bkpdg\") pod \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\" (UID: \"9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4\") " Apr 28 20:38:15.715940 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.715917 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-kube-api-access-bkpdg" (OuterVolumeSpecName: "kube-api-access-bkpdg") pod "9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" (UID: "9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4"). InnerVolumeSpecName "kube-api-access-bkpdg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:38:15.719742 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.719717 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" (UID: "9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:38:15.738786 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.738763 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d548q_must-gather-cv6x2_9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4/copy/0.log" Apr 28 20:38:15.739098 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.739077 2565 generic.go:358] "Generic (PLEG): container finished" podID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerID="ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8" exitCode=143 Apr 28 20:38:15.739188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.739125 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d548q/must-gather-cv6x2" Apr 28 20:38:15.739188 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.739149 2565 scope.go:117] "RemoveContainer" containerID="ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8" Apr 28 20:38:15.741599 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.741572 2565 status_manager.go:895] "Failed to get status for pod" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" pod="openshift-must-gather-d548q/must-gather-cv6x2" err="pods \"must-gather-cv6x2\" is forbidden: User \"system:node:ip-10-0-141-41.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-d548q\": no relationship found between node 'ip-10-0-141-41.ec2.internal' and this object" Apr 28 20:38:15.746693 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.746676 2565 scope.go:117] "RemoveContainer" containerID="fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784" Apr 28 20:38:15.751997 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.751969 2565 status_manager.go:895] "Failed to get status for pod" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" pod="openshift-must-gather-d548q/must-gather-cv6x2" err="pods \"must-gather-cv6x2\" is forbidden: User \"system:node:ip-10-0-141-41.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-d548q\": no relationship found between node 'ip-10-0-141-41.ec2.internal' and this object" Apr 28 20:38:15.759620 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.759603 2565 scope.go:117] "RemoveContainer" containerID="ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8" Apr 28 20:38:15.759857 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:38:15.759839 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8\": container with ID starting 
with ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8 not found: ID does not exist" containerID="ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8" Apr 28 20:38:15.759898 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.759865 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8"} err="failed to get container status \"ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8\": rpc error: code = NotFound desc = could not find container \"ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8\": container with ID starting with ad0b6eec4beb205b7684581d7ba58a76c0cce7e3351d0b2567cd4b0adbac0db8 not found: ID does not exist" Apr 28 20:38:15.759898 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.759881 2565 scope.go:117] "RemoveContainer" containerID="fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784" Apr 28 20:38:15.760095 ip-10-0-141-41 kubenswrapper[2565]: E0428 20:38:15.760070 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784\": container with ID starting with fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784 not found: ID does not exist" containerID="fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784" Apr 28 20:38:15.760152 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.760104 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784"} err="failed to get container status \"fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784\": rpc error: code = NotFound desc = could not find container \"fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784\": container with ID starting with 
fc41fe7730623dc5629181f11f560180b8a9e64d3a212cdb960547e60ec22784 not found: ID does not exist" Apr 28 20:38:15.815285 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.815255 2565 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-must-gather-output\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:38:15.815285 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:15.815281 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkpdg\" (UniqueName: \"kubernetes.io/projected/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4-kube-api-access-bkpdg\") on node \"ip-10-0-141-41.ec2.internal\" DevicePath \"\"" Apr 28 20:38:16.110463 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:16.110427 2565 status_manager.go:895] "Failed to get status for pod" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" pod="openshift-must-gather-d548q/must-gather-cv6x2" err="pods \"must-gather-cv6x2\" is forbidden: User \"system:node:ip-10-0-141-41.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-d548q\": no relationship found between node 'ip-10-0-141-41.ec2.internal' and this object" Apr 28 20:38:16.110714 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:16.110697 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" path="/var/lib/kubelet/pods/9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4/volumes" Apr 28 20:38:17.332760 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:17.332725 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-fz2s4_a607d0e0-237b-4251-b606-3b4e9e2db6c6/manager/0.log" Apr 28 20:38:17.448458 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:17.448361 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rtl6k_442d9435-e707-4a83-8ae3-039a186a940b/manager/0.log" Apr 28 20:38:18.612657 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:18.612627 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-mj7v4_9b167459-93b9-4e7b-bd66-94d693cab19e/cluster-monitoring-operator/0.log" Apr 28 20:38:18.638088 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:18.638059 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-62xq8_2ed37b5e-0c0e-4891-be01-307728d47ad3/kube-state-metrics/0.log" Apr 28 20:38:18.653785 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:18.653762 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-62xq8_2ed37b5e-0c0e-4891-be01-307728d47ad3/kube-rbac-proxy-main/0.log" Apr 28 20:38:18.669050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:18.669022 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-62xq8_2ed37b5e-0c0e-4891-be01-307728d47ad3/kube-rbac-proxy-self/0.log" Apr 28 20:38:18.835204 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:18.835102 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dg759_35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b/node-exporter/0.log" Apr 28 20:38:18.850283 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:18.850248 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dg759_35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b/kube-rbac-proxy/0.log" Apr 28 20:38:18.868405 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:18.868381 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dg759_35d30ac3-c6ef-47c7-9d24-bb4f9f7faa0b/init-textfile/0.log" Apr 28 20:38:19.072358 ip-10-0-141-41 
kubenswrapper[2565]: I0428 20:38:19.072334 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f75adb1e-f78f-4176-bbb0-f18794bdf5ef/prometheus/0.log" Apr 28 20:38:19.085704 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.085628 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f75adb1e-f78f-4176-bbb0-f18794bdf5ef/config-reloader/0.log" Apr 28 20:38:19.102980 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.102957 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f75adb1e-f78f-4176-bbb0-f18794bdf5ef/thanos-sidecar/0.log" Apr 28 20:38:19.119191 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.119153 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f75adb1e-f78f-4176-bbb0-f18794bdf5ef/kube-rbac-proxy-web/0.log" Apr 28 20:38:19.136455 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.136435 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f75adb1e-f78f-4176-bbb0-f18794bdf5ef/kube-rbac-proxy/0.log" Apr 28 20:38:19.152489 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.152466 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f75adb1e-f78f-4176-bbb0-f18794bdf5ef/kube-rbac-proxy-thanos/0.log" Apr 28 20:38:19.169630 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.169605 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f75adb1e-f78f-4176-bbb0-f18794bdf5ef/init-config-reloader/0.log" Apr 28 20:38:19.195510 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.195489 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kdztz_14945d18-ba40-4efc-9f03-887d459daa01/prometheus-operator/0.log" Apr 28 20:38:19.211839 ip-10-0-141-41 kubenswrapper[2565]: 
I0428 20:38:19.211822 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kdztz_14945d18-ba40-4efc-9f03-887d459daa01/kube-rbac-proxy/0.log" Apr 28 20:38:19.282349 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.282315 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65b7d86768-x95lb_ee9aa6c0-9937-4be5-8ccb-f718ea8400e8/telemeter-client/0.log" Apr 28 20:38:19.305534 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.305491 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65b7d86768-x95lb_ee9aa6c0-9937-4be5-8ccb-f718ea8400e8/reload/0.log" Apr 28 20:38:19.321578 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:19.321556 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65b7d86768-x95lb_ee9aa6c0-9937-4be5-8ccb-f718ea8400e8/kube-rbac-proxy/0.log" Apr 28 20:38:21.766407 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:21.766379 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-8d77s_4a51790b-71af-4495-bf67-814c27aeb63e/download-server/0.log" Apr 28 20:38:22.114492 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.114419 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq"] Apr 28 20:38:22.114769 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.114757 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerName="copy" Apr 28 20:38:22.114819 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.114770 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerName="copy" Apr 28 20:38:22.114819 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.114806 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerName="gather" Apr 28 20:38:22.114819 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.114811 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerName="gather" Apr 28 20:38:22.114919 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.114870 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerName="gather" Apr 28 20:38:22.114919 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.114880 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d9dbb8e-5396-4fce-a3e8-4c5a97c24ce4" containerName="copy" Apr 28 20:38:22.121412 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.121391 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.123970 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.123934 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hjc6s\"/\"default-dockercfg-nkncc\"" Apr 28 20:38:22.125244 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.125217 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjc6s\"/\"kube-root-ca.crt\"" Apr 28 20:38:22.125360 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.125278 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjc6s\"/\"openshift-service-ca.crt\"" Apr 28 20:38:22.127668 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.127647 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq"] Apr 28 20:38:22.171059 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.171029 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-podres\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.171208 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.171086 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-sys\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.171208 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.171113 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj7ck\" (UniqueName: \"kubernetes.io/projected/eef039c9-22e9-4a61-83fc-f7c3b713efc2-kube-api-access-fj7ck\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.171208 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.171180 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-lib-modules\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.171357 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.171240 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-proc\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " 
pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.256264 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.256230 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-bqqnv_8d700fe5-5b61-4638-9883-cb767568ec47/volume-data-source-validator/0.log" Apr 28 20:38:22.272380 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-sys\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.272569 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272385 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fj7ck\" (UniqueName: \"kubernetes.io/projected/eef039c9-22e9-4a61-83fc-f7c3b713efc2-kube-api-access-fj7ck\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.272569 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272420 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-lib-modules\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.272569 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272454 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-proc\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " 
pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.272569 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272477 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-sys\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.272569 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272484 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-podres\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.272758 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272578 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-podres\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.272758 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272585 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-proc\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.272758 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.272603 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eef039c9-22e9-4a61-83fc-f7c3b713efc2-lib-modules\") pod 
\"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.280175 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.280142 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj7ck\" (UniqueName: \"kubernetes.io/projected/eef039c9-22e9-4a61-83fc-f7c3b713efc2-kube-api-access-fj7ck\") pod \"perf-node-gather-daemonset-pshcq\" (UID: \"eef039c9-22e9-4a61-83fc-f7c3b713efc2\") " pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.431826 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.431733 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.551952 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.551910 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq"] Apr 28 20:38:22.554134 ip-10-0-141-41 kubenswrapper[2565]: W0428 20:38:22.554098 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeef039c9_22e9_4a61_83fc_f7c3b713efc2.slice/crio-becd410cf0de617c91d47a51e036014320d98d126f97f998b52be14ea95ba7ac WatchSource:0}: Error finding container becd410cf0de617c91d47a51e036014320d98d126f97f998b52be14ea95ba7ac: Status 404 returned error can't find the container with id becd410cf0de617c91d47a51e036014320d98d126f97f998b52be14ea95ba7ac Apr 28 20:38:22.764080 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.764044 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" event={"ID":"eef039c9-22e9-4a61-83fc-f7c3b713efc2","Type":"ContainerStarted","Data":"76e199f6534edc421f463ded3c1f3799f0ec266d1cd7ac02ab5594df98095a4a"} Apr 28 20:38:22.764080 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.764082 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" event={"ID":"eef039c9-22e9-4a61-83fc-f7c3b713efc2","Type":"ContainerStarted","Data":"becd410cf0de617c91d47a51e036014320d98d126f97f998b52be14ea95ba7ac"} Apr 28 20:38:22.764313 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.764102 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" Apr 28 20:38:22.780769 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.780722 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq" podStartSLOduration=0.780708879 podStartE2EDuration="780.708879ms" podCreationTimestamp="2026-04-28 20:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:38:22.779764741 +0000 UTC m=+4887.285481407" watchObservedRunningTime="2026-04-28 20:38:22.780708879 +0000 UTC m=+4887.286425544" Apr 28 20:38:22.951587 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.951553 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2jwqs_4d5f310c-a755-4af1-8570-335ac92bb8cf/dns/0.log" Apr 28 20:38:22.969275 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:22.969251 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2jwqs_4d5f310c-a755-4af1-8570-335ac92bb8cf/kube-rbac-proxy/0.log" Apr 28 20:38:23.071050 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:23.070974 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gg56k_5b0e3ab7-8ea3-4aea-8300-4e7ecf70c550/dns-node-resolver/0.log" Apr 28 20:38:23.576305 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:23.576261 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-ttwth_784cb0c1-1c08-41f8-8c08-e92cdf0c70ce/node-ca/0.log"
Apr 28 20:38:24.416212 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:24.416134 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d64467788-4mcrw_977bb0b7-3623-4100-ba3a-1b9d24046162/router/0.log"
Apr 28 20:38:24.854005 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:24.853975 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-w7t9f_4f0dd845-b66a-4d78-b7a3-811ca24028e4/serve-healthcheck-canary/0.log"
Apr 28 20:38:25.284941 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:25.284912 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9zfhp_7b91d51c-a9f3-41ac-8b7e-09e04af9b26a/kube-rbac-proxy/0.log"
Apr 28 20:38:25.300970 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:25.300943 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9zfhp_7b91d51c-a9f3-41ac-8b7e-09e04af9b26a/exporter/0.log"
Apr 28 20:38:25.316852 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:25.316826 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9zfhp_7b91d51c-a9f3-41ac-8b7e-09e04af9b26a/extractor/0.log"
Apr 28 20:38:27.904716 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:27.904673 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-79cf4cb497-8sztj_e928964b-a243-4db2-8b96-8c4ca7e022f8/manager/0.log"
Apr 28 20:38:28.427040 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:28.426983 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-b85c69797-wpbd6_36820a29-0121-440b-9c92-de28da74677f/manager/0.log"
Apr 28 20:38:28.777525 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:28.777497 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-hjc6s/perf-node-gather-daemonset-pshcq"
Apr 28 20:38:33.097480 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:33.097448 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ljnqn_d88080ac-e246-4a18-88af-b696d1f2fc08/migrator/0.log"
Apr 28 20:38:33.112301 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:33.112272 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ljnqn_d88080ac-e246-4a18-88af-b696d1f2fc08/graceful-termination/0.log"
Apr 28 20:38:34.384770 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.384741 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-64s26_3b36aafe-8438-4826-93f5-10d39349a4f7/kube-multus/0.log"
Apr 28 20:38:34.410852 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.410825 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll4ff_2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6/kube-multus-additional-cni-plugins/0.log"
Apr 28 20:38:34.427192 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.427149 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll4ff_2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6/egress-router-binary-copy/0.log"
Apr 28 20:38:34.441944 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.441921 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll4ff_2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6/cni-plugins/0.log"
Apr 28 20:38:34.458020 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.457998 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll4ff_2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6/bond-cni-plugin/0.log"
Apr 28 20:38:34.473870 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.473848 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll4ff_2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6/routeoverride-cni/0.log"
Apr 28 20:38:34.488463 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.488438 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll4ff_2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6/whereabouts-cni-bincopy/0.log"
Apr 28 20:38:34.505325 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.505278 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll4ff_2e4607bc-ad98-4ac7-a34e-0cf93fb04ba6/whereabouts-cni/0.log"
Apr 28 20:38:34.921896 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.921866 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hndjc_cbe36bec-c099-4625-b8c9-eb67c281b442/network-metrics-daemon/0.log"
Apr 28 20:38:34.938085 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:34.938058 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hndjc_cbe36bec-c099-4625-b8c9-eb67c281b442/kube-rbac-proxy/0.log"
Apr 28 20:38:35.702231 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:35.702192 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqj2n_46f624dd-71ff-4136-b6dd-c90053e2799c/ovn-controller/0.log"
Apr 28 20:38:35.757276 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:35.757245 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqj2n_46f624dd-71ff-4136-b6dd-c90053e2799c/ovn-acl-logging/0.log"
Apr 28 20:38:35.775391 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:35.775366 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqj2n_46f624dd-71ff-4136-b6dd-c90053e2799c/kube-rbac-proxy-node/0.log"
Apr 28 20:38:35.793081 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:35.793059 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqj2n_46f624dd-71ff-4136-b6dd-c90053e2799c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 28 20:38:35.806643 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:35.806594 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqj2n_46f624dd-71ff-4136-b6dd-c90053e2799c/northd/0.log"
Apr 28 20:38:35.820874 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:35.820851 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqj2n_46f624dd-71ff-4136-b6dd-c90053e2799c/nbdb/0.log"
Apr 28 20:38:35.835684 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:35.835662 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqj2n_46f624dd-71ff-4136-b6dd-c90053e2799c/sbdb/0.log"
Apr 28 20:38:35.998385 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:35.998305 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqj2n_46f624dd-71ff-4136-b6dd-c90053e2799c/ovnkube-controller/0.log"
Apr 28 20:38:37.597281 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:37.597254 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vffwj_3daf2574-ff6f-4e8a-b1ed-11bf807c7403/check-endpoints/0.log"
Apr 28 20:38:37.615506 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:37.615477 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-j22v7_f9d394e5-59b9-48cb-b465-c8476cbd89d1/network-check-target-container/0.log"
Apr 28 20:38:38.593428 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:38.593400 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hpgl5_2e96850e-1476-4be5-9535-99fe12d6740c/iptables-alerter/0.log"
Apr 28 20:38:39.287615 ip-10-0-141-41 kubenswrapper[2565]: I0428 20:38:39.287584 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lsmn8_ce296200-b7fe-4c18-8f04-0fcf672d1613/tuned/0.log"