Apr 24 22:29:52.736288 ip-10-0-137-103 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:29:53.176181 ip-10-0-137-103 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:53.176181 ip-10-0-137-103 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:29:53.176181 ip-10-0-137-103 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:53.176181 ip-10-0-137-103 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:29:53.176181 ip-10-0-137-103 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
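The deprecation warnings above all point at the file named by the kubelet's --config flag. As a sketch of how those flags might be migrated, assuming the kubelet.config.k8s.io/v1beta1 KubeletConfiguration schema — the field names are the documented config-file equivalents of the deprecated flags, but every value below is an illustrative placeholder, not taken from this node:

```yaml
# Hypothetical fragment of the file passed via --config (placeholder values).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration has no direct config field;
# the warning says to use eviction thresholds instead
evictionHard:
  memory.available: 100Mi
```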
Apr 24 22:29:53.177575 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.177510 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:29:53.180413 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180396 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:53.180413 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180411 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:53.180413 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180416 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180420 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180425 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180429 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180433 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180438 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180442 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180448 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180454 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180458 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180463 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180468 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180485 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180490 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180494 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180498 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180502 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180506 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180509 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:53.180600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180513 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180517 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180522 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180526 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180529 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180534 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180538 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180542 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180546 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180550 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180555 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180559 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180564 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180568 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180572 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180578 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180582 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180587 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180591 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180595 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:53.181401 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180599 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180603 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180607 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180611 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180615 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180620 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180632 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180637 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180642 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180646 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180650 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180655 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180659 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180663 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180668 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180672 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180676 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180680 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180684 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180688 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:53.182108 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180692 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180697 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180701 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180705 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180710 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180714 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180719 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180724 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180729 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180736 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180740 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180743 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180748 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180752 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180756 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180760 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180764 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180768 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180772 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180776 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:53.182600 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180780 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180784 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180788 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180793 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.180797 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181373 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181381 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181386 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181390 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181395 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181399 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181403 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181407 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181411 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181415 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181419 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181425 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181429 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181433 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181438 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:53.183411 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181443 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181448 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181452 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181456 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181461 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181465 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181469 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181473 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181477 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181481 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181485 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181490 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181494 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181498 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181502 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181506 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181510 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181514 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181518 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181522 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:53.183961 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181526 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181530 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181535 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181539 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181544 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181549 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181553 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181557 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181561 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181566 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181570 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181574 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181578 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181583 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181587 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181591 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181595 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181599 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181603 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181607 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:53.184516 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181612 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181616 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181620 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181624 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181628 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181633 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181637 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181641 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181645 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181649 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181653 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181658 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181662 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181666 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181671 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181675 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181682 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181688 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181693 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:53.184994 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181701 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181708 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181714 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181719 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181724 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181728 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181733 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181737 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181742 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181746 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181750 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.181753 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182669 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182684 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182696 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182703 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182709 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182714 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182723 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182730 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182736 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:29:53.185471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182741 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182746 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182751 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182756 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182761 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182766 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182771 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182775 2572 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182780 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182785 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182791 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182797 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182802 2572 flags.go:64] FLAG: --config-dir=""
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182806 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182812 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182818 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182823 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182828 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182833 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182839 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182844 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182849 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182854 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182859 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182865 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:29:53.185982 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182870 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182875 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182879 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182884 2572 flags.go:64] FLAG: --enable-server="true"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182891 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182898 2572 flags.go:64] FLAG: --event-burst="100"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182903 2572 flags.go:64] FLAG: --event-qps="50"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182908 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182913 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182918 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182923 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182928 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182933 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182938 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182943 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182947 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182952 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182956 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182963 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182968 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182972 2572 flags.go:64] FLAG: --feature-gates=""
Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182979
2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182983 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182988 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182994 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.182998 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:29:53.186614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183003 2572 flags.go:64] FLAG: --help="false" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183022 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-137-103.ec2.internal" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183029 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183033 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183038 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183044 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183049 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183054 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183058 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:29:53.187262 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183063 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183072 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183078 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183083 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183088 2572 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183093 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183097 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183102 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183107 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183111 2572 flags.go:64] FLAG: --lock-file="" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183116 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183121 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183126 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183134 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:29:53.187262 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183139 2572 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183145 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183150 2572 flags.go:64] FLAG: --logging-format="text" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183154 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183159 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183164 2572 flags.go:64] FLAG: --manifest-url="" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183169 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183175 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183180 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183186 2572 flags.go:64] FLAG: --max-pods="110" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183191 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183196 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183201 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183206 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183211 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:29:53.187807 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183216 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183220 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183231 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183236 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183242 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183247 2572 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183252 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183261 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183266 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:29:53.187807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183271 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183275 2572 flags.go:64] FLAG: --port="10250" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183280 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183285 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-080cbef7b1c33434f" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183290 2572 flags.go:64] FLAG: --qos-reserved="" Apr 
24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183295 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183300 2572 flags.go:64] FLAG: --register-node="true" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183305 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183309 2572 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183322 2572 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183329 2572 flags.go:64] FLAG: --registry-qps="5" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183333 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183338 2572 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183365 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183385 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183390 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183394 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183398 2572 flags.go:64] FLAG: --runonce="false" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183402 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183405 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183409 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183412 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183416 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183419 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183423 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183426 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:29:53.188398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183429 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183432 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183435 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183438 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183442 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183445 2572 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183448 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183456 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:29:53.189064 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:29:53.183459 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183462 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183469 2572 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183472 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183475 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183478 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183481 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183484 2572 flags.go:64] FLAG: --v="2" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183489 2572 flags.go:64] FLAG: --version="false" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183494 2572 flags.go:64] FLAG: --vmodule="" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183498 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.183501 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183592 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183596 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183599 2572 feature_gate.go:328] 
unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183602 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:53.189064 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183605 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183608 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183611 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183613 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183617 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183619 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183622 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183624 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183627 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183629 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183632 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:53.189659 ip-10-0-137-103 
kubenswrapper[2572]: W0424 22:29:53.183634 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183637 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183641 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183645 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183648 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183651 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183654 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183657 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:53.189659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183660 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183662 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183665 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183667 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183670 2572 
feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183672 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183675 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183677 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183680 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183684 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183686 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183689 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183692 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183694 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183697 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183700 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183702 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 
22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183705 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183708 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183710 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:53.190156 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183713 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183715 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183718 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183721 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183724 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183727 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183729 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183732 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183734 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183737 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall 
Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183740 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183742 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183745 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183748 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183750 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183753 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183756 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183758 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183761 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183763 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:53.190668 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183766 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183769 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183771 2572 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183774 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183776 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183779 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183781 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183784 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183786 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183789 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183791 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183794 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183796 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183799 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183801 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 
Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183805 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183809 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183812 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183814 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:53.191173 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183817 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:53.191662 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183820 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:53.191662 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183822 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:53.191662 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.183825 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:53.191662 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.184668 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:53.196766 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.196745 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 22:29:53.196766 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.196765 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196811 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196817 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196820 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196823 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196826 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196829 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196831 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196835 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196837 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196840 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196842 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196845 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196848 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196850 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196853 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196855 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196858 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196860 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196863 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:53.196867 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196866 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196868 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196871 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196874 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196877 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196880 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196882 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196885 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196888 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196890 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196893 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196896 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196898 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196901 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196904 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196906 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196910 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196914 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196917 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196920 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:53.197373 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196922 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196925 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196927 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196930 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196932 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196935 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196937 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196940 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196942 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196945 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196947 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196950 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196952 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196955 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196959 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196961 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196964 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196967 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196970 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196972 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:53.197891 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196975 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196977 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196980 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196983 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196985 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196988 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196990 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196993 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196995 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.196998 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197000 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197002 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197005 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197007 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197025 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197028 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197030 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197033 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197035 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:53.198467 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197039 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197042 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197045 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197047 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197051 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197055 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197057 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197061 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.197066 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197154 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197158 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197161 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197164 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197167 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197170 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:53.198927 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197172 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197175 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197178 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197180 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197183 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197186 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197188 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197191 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197193 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197196 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197198 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197201 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197203 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197206 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197208 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197211 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197214 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197217 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197219 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:53.199329 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197222 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197224 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197227 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197230 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197232 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197235 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197238 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197240 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197243 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197245 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197248 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197250 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197253 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197255 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197258 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197260 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197263 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197265 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197269 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:53.199777 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197272 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197275 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197278 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197281 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197284 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197286 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197289 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197292 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197294 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197297 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197299 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197302 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197305 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197307 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197310 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197313 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197317 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197319 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197322 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197324 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:53.200249 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197327 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197329 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197331 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197334 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197336 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197339 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197341 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197344 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197346 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197349 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197351 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197354 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197356 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197358 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197361 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197363 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197366 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197368 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197370 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197373 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:53.200739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197375 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:53.201231 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:53.197378 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:53.201231 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.197383 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:53.201231 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.198043 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 22:29:53.201231 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.201097 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 22:29:53.202100 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.202085 2572 server.go:1019] "Starting client certificate rotation"
Apr 24 22:29:53.202194 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.202178 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:53.202226 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.202219 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:53.228230 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.228215 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:53.232520 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.232494 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:53.242023 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.241998 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 24 22:29:53.247906 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.247889 2572 log.go:25] "Validated CRI v1 image API"
Apr 24 22:29:53.249242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.249228 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 22:29:53.253289 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.253267 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 dbed5bd7-1a44-4cd1-9f88-f7854d86a151:/dev/nvme0n1p4 f5085138-765b-46c1-b915-883f382dc72a:/dev/nvme0n1p3]
Apr 24 22:29:53.253350 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.253289 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 22:29:53.260426 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.260324 2572 manager.go:217] Machine: {Timestamp:2026-04-24 22:29:53.258450237 +0000 UTC m=+0.406755776 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097348 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a53d294a2d70f1a58b16dcc1c03c7 SystemUUID:ec2a53d2-94a2-d70f-1a58-b16dcc1c03c7 BootID:b3388d60-052e-4b29-896f-5d1c2326ae46 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:16:e5:11:cb:23 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:16:e5:11:cb:23 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:2e:10:4c:dd:1a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 22:29:53.260426 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.260416 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 22:29:53.260555 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.260507 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 22:29:53.261625 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.261604 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 22:29:53.261819 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.261629 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-103.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 22:29:53.261866 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.261833 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 22:29:53.261866 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.261846 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 22:29:53.261926 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.261887 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:53.261926 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.261885 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:53.262660 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.262647 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:53.264090 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.264079 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:53.264196 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.264187 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 22:29:53.266796 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.266786 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 22:29:53.266833 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.266799 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 22:29:53.266833 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.266810 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 22:29:53.266833 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.266818 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 24 22:29:53.266833 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.266827 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 22:29:53.267851 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.267839 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 22:29:53.267895 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.267858 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information
tracking" Apr 24 22:29:53.270747 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.270720 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 22:29:53.272287 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.272272 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 22:29:53.274227 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274213 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 22:29:53.274271 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274236 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 22:29:53.274271 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274249 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 22:29:53.274271 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274258 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 22:29:53.274271 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274266 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 22:29:53.274389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274274 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 22:29:53.274389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274283 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 22:29:53.274389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274291 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:29:53.274389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274301 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:29:53.274389 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:29:53.274309 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:29:53.274389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274326 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:29:53.274389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.274340 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:29:53.276024 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.275999 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:29:53.276117 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.276025 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:29:53.279032 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.278996 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-103.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 22:29:53.279095 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.279029 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 22:29:53.279582 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.279571 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:29:53.279625 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.279607 2572 server.go:1295] "Started kubelet" Apr 24 22:29:53.279704 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.279684 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 22:29:53.279825 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:29:53.279764 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 22:29:53.279905 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.279850 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 22:29:53.280243 ip-10-0-137-103 systemd[1]: Started Kubernetes Kubelet. Apr 24 22:29:53.281085 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.280907 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 22:29:53.283290 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.283182 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 24 22:29:53.286214 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.286192 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 22:29:53.286214 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.286204 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 22:29:53.286957 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.286940 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 22:29:53.287069 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.287001 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found" Apr 24 22:29:53.287142 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.287118 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 22:29:53.287142 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.287130 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 22:29:53.287244 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.287174 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial 
unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 22:29:53.287244 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.287190 2572 factory.go:55] Registering systemd factory Apr 24 22:29:53.287244 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.287200 2572 factory.go:223] Registration of the systemd container factory successfully Apr 24 22:29:53.287244 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.287236 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 24 22:29:53.287244 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.287247 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 24 22:29:53.290354 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.290324 2572 factory.go:153] Registering CRI-O factory Apr 24 22:29:53.290354 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.290343 2572 factory.go:223] Registration of the crio container factory successfully Apr 24 22:29:53.290506 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.290377 2572 factory.go:103] Registering Raw factory Apr 24 22:29:53.290506 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.290390 2572 manager.go:1196] Started watching for new ooms in manager Apr 24 22:29:53.290835 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.290817 2572 manager.go:319] Starting recovery of all containers Apr 24 22:29:53.294517 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.294479 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 22:29:53.294717 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.294682 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-103.ec2.internal" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 22:29:53.295077 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.295055 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-103.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 22:29:53.297124 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.297101 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 22:29:53.300080 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.294753 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-103.ec2.internal.18a96b8fc981a032 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-103.ec2.internal,UID:ip-10-0-137-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-103.ec2.internal,},FirstTimestamp:2026-04-24 22:29:53.279582258 +0000 UTC m=+0.427887798,LastTimestamp:2026-04-24 22:29:53.279582258 +0000 UTC m=+0.427887798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-103.ec2.internal,}" Apr 24 22:29:53.306078 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.306063 2572 manager.go:324] Recovery completed Apr 24 22:29:53.309870 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.309857 2572 kubelet_node_status.go:413] "Setting node 
annotation to enable volume controller attach/detach" Apr 24 22:29:53.312145 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.312129 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:53.312211 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.312157 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:53.312211 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.312169 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:53.312575 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.312561 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 22:29:53.312575 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.312573 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 22:29:53.312653 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.312590 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:29:53.314347 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.314287 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-103.ec2.internal.18a96b8fcb727c5b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-103.ec2.internal,UID:ip-10-0-137-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-103.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-103.ec2.internal,},FirstTimestamp:2026-04-24 22:29:53.312144475 +0000 UTC m=+0.460450014,LastTimestamp:2026-04-24 
22:29:53.312144475 +0000 UTC m=+0.460450014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-103.ec2.internal,}" Apr 24 22:29:53.314695 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.314683 2572 policy_none.go:49] "None policy: Start" Apr 24 22:29:53.314762 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.314699 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 22:29:53.314762 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.314709 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 24 22:29:53.323301 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.323244 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-103.ec2.internal.18a96b8fcb72c95a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-103.ec2.internal,UID:ip-10-0-137-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-137-103.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-137-103.ec2.internal,},FirstTimestamp:2026-04-24 22:29:53.312164186 +0000 UTC m=+0.460469724,LastTimestamp:2026-04-24 22:29:53.312164186 +0000 UTC m=+0.460469724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-103.ec2.internal,}" Apr 24 22:29:53.337838 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.337765 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-137-103.ec2.internal.18a96b8fcb72efac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-103.ec2.internal,UID:ip-10-0-137-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-137-103.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-137-103.ec2.internal,},FirstTimestamp:2026-04-24 22:29:53.312173996 +0000 UTC m=+0.460479534,LastTimestamp:2026-04-24 22:29:53.312173996 +0000 UTC m=+0.460479534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-103.ec2.internal,}" Apr 24 22:29:53.355517 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.355506 2572 manager.go:341] "Starting Device Plugin manager" Apr 24 22:29:53.355591 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.355569 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 22:29:53.355591 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.355579 2572 server.go:85] "Starting device plugin registration server" Apr 24 22:29:53.355786 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.355775 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 22:29:53.355834 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.355789 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 22:29:53.356209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.355892 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 22:29:53.356209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.355972 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 22:29:53.356209 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.355979 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 22:29:53.356482 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.356442 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 22:29:53.356482 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.356476 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-103.ec2.internal\" not found" Apr 24 22:29:53.368158 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.368103 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-103.ec2.internal.18a96b8fce2ce9a5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-103.ec2.internal,UID:ip-10-0-137-103.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-137-103.ec2.internal,},FirstTimestamp:2026-04-24 22:29:53.357916581 +0000 UTC m=+0.506222111,LastTimestamp:2026-04-24 22:29:53.357916581 +0000 UTC m=+0.506222111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-103.ec2.internal,}" Apr 24 22:29:53.411833 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.411808 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 22:29:53.412888 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.412869 2572 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 22:29:53.412957 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.412892 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 22:29:53.412957 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.412907 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 22:29:53.412957 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.412914 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 22:29:53.412957 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.412944 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 22:29:53.423811 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.423796 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xcsxk" Apr 24 22:29:53.428202 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.428148 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 24 22:29:53.434471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.434458 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xcsxk" Apr 24 22:29:53.456581 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.456557 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:53.457256 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.457242 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:53.457329 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.457264 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:53.457329 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.457276 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:53.457329 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.457316 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-103.ec2.internal" Apr 24 22:29:53.473550 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.473531 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-103.ec2.internal" Apr 24 22:29:53.473633 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.473553 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-103.ec2.internal\": node \"ip-10-0-137-103.ec2.internal\" not found" Apr 24 22:29:53.510840 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.510821 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found" Apr 24 22:29:53.513897 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.513876 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal"] Apr 24 22:29:53.513972 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.513941 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:53.514670 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.514654 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:53.514762 ip-10-0-137-103 kubenswrapper[2572]: 
I0424 22:29:53.514681 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:53.514762 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.514691 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:53.515793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.515781 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:53.515938 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.515924 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal" Apr 24 22:29:53.515993 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.515955 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:53.516403 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.516390 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:53.516464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.516406 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:53.516464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.516429 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:53.516464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.516441 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:53.516575 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.516412 2572 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-137-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:53.516575 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.516487 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:53.517429 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.517416 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal" Apr 24 22:29:53.517478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.517437 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:29:53.518045 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.518031 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:29:53.518122 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.518059 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:29:53.518122 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.518073 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:29:53.540646 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.540627 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-103.ec2.internal\" not found" node="ip-10-0-137-103.ec2.internal" Apr 24 22:29:53.544797 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.544783 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-103.ec2.internal\" not found" node="ip-10-0-137-103.ec2.internal" Apr 24 22:29:53.611459 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.611440 2572 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:53.689106 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.689063 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/967fd0e774275622d49edbbb2962b348-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal\" (UID: \"967fd0e774275622d49edbbb2962b348\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.689106 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.689088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02aaeaca2e7902be6bed83a0bfe87afe-config\") pod \"kube-apiserver-proxy-ip-10-0-137-103.ec2.internal\" (UID: \"02aaeaca2e7902be6bed83a0bfe87afe\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.689209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.689104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/967fd0e774275622d49edbbb2962b348-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal\" (UID: \"967fd0e774275622d49edbbb2962b348\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.712397 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.712378 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:53.789815 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.789790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/967fd0e774275622d49edbbb2962b348-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal\" (UID: \"967fd0e774275622d49edbbb2962b348\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.789909 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.789816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/967fd0e774275622d49edbbb2962b348-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal\" (UID: \"967fd0e774275622d49edbbb2962b348\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.789909 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.789836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02aaeaca2e7902be6bed83a0bfe87afe-config\") pod \"kube-apiserver-proxy-ip-10-0-137-103.ec2.internal\" (UID: \"02aaeaca2e7902be6bed83a0bfe87afe\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.789909 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.789880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/967fd0e774275622d49edbbb2962b348-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal\" (UID: \"967fd0e774275622d49edbbb2962b348\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.790037 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.789924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02aaeaca2e7902be6bed83a0bfe87afe-config\") pod \"kube-apiserver-proxy-ip-10-0-137-103.ec2.internal\" (UID: \"02aaeaca2e7902be6bed83a0bfe87afe\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.790037 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.789947 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/967fd0e774275622d49edbbb2962b348-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal\" (UID: \"967fd0e774275622d49edbbb2962b348\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.813262 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.813242 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:53.842424 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.842405 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.847882 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:53.847866 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:53.913786 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:53.913761 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:54.014330 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:54.014306 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:54.114897 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:54.114873 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:54.202357 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.202333 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:29:54.215474 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:54.215455 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:54.286988 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.286961 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:54.315234 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.315207 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:54.315507 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:54.315491 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:54.349180 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:54.349157 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02aaeaca2e7902be6bed83a0bfe87afe.slice/crio-512d3ccc290d1a9c2672b780f511c5e22074a8901a0ade6d46ea3d9e4f8da999 WatchSource:0}: Error finding container 512d3ccc290d1a9c2672b780f511c5e22074a8901a0ade6d46ea3d9e4f8da999: Status 404 returned error can't find the container with id 512d3ccc290d1a9c2672b780f511c5e22074a8901a0ade6d46ea3d9e4f8da999
Apr 24 22:29:54.349410 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:54.349390 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967fd0e774275622d49edbbb2962b348.slice/crio-e647f94d05b9f90639387f05235eb655c108de3f517349b5bdc0492a8a77461c WatchSource:0}: Error finding container e647f94d05b9f90639387f05235eb655c108de3f517349b5bdc0492a8a77461c: Status 404 returned error can't find the container with id e647f94d05b9f90639387f05235eb655c108de3f517349b5bdc0492a8a77461c
Apr 24 22:29:54.353245 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.353231 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:29:54.371024 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.370991 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-d9clk"
Apr 24 22:29:54.372149 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.372133 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:54.380652 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.380631 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-d9clk"
Apr 24 22:29:54.415594 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:54.415574 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:54.416070 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.416036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal" event={"ID":"967fd0e774275622d49edbbb2962b348","Type":"ContainerStarted","Data":"e647f94d05b9f90639387f05235eb655c108de3f517349b5bdc0492a8a77461c"}
Apr 24 22:29:54.417046 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.417026 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal" event={"ID":"02aaeaca2e7902be6bed83a0bfe87afe","Type":"ContainerStarted","Data":"512d3ccc290d1a9c2672b780f511c5e22074a8901a0ade6d46ea3d9e4f8da999"}
Apr 24 22:29:54.437213 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.437190 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:24:53 +0000 UTC" deadline="2028-01-26 04:26:03.556679548 +0000 UTC"
Apr 24 22:29:54.437213 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.437209 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15389h56m9.119473515s"
Apr 24 22:29:54.516661 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:54.516617 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:54.617184 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:54.617162 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-103.ec2.internal\" not found"
Apr 24 22:29:54.713675 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.713646 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:54.738296 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.738277 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:54.787039 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.786816 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:54.811516 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.811492 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:54.811617 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.811609 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal"
Apr 24 22:29:54.822266 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.822242 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:54.837143 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:54.837123 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:55.267976 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.267902 2572 apiserver.go:52] "Watching apiserver"
Apr 24 22:29:55.277945 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.277919 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 22:29:55.278385 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.278361 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x9trm","kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal","openshift-cluster-node-tuning-operator/tuned-4hbpm","openshift-dns/node-resolver-d44dl","openshift-image-registry/node-ca-qgt9r","openshift-multus/multus-t9drx","openshift-multus/network-metrics-daemon-v4wwp","openshift-network-diagnostics/network-check-target-6q8zg","openshift-network-operator/iptables-alerter-rsk2h","kube-system/konnectivity-agent-pqcxg","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal","openshift-multus/multus-additional-cni-plugins-l57fj"]
Apr 24 22:29:55.281042 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.281023 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:29:55.281121 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.281095 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:29:55.282753 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.282733 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.284464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.284445 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d44dl"
Apr 24 22:29:55.286219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.286199 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qgt9r"
Apr 24 22:29:55.288223 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.288203 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.289242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.289221 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 22:29:55.289434 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.289410 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pqpxh\""
Apr 24 22:29:55.289847 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.289682 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 22:29:55.289940 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.289925 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:55.290279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.290262 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.292193 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.292176 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:29:55.292265 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.292228 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:29:55.294301 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.294278 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 22:29:55.294396 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.294381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 22:29:55.294396 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.294390 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:55.294481 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.294393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 22:29:55.294533 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.294482 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 22:29:55.294533 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.294487 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 22:29:55.294666 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.294653 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 22:29:55.295392 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.295372 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rsk2h"
Apr 24 22:29:55.295575 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.295549 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 22:29:55.295965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.295667 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 22:29:55.295965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.295671 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 22:29:55.295965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.295725 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 22:29:55.295965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.295856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-sys\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.295965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.295887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-cni-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.295965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.295914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-multus-certs\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e6d3a55-047e-4c45-a923-7fa5ce12912c-serviceca\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-kubelet\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296402 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-run\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-hosts-file\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-k8s-cni-cncf-io\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-cni-bin\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-etc-kubernetes\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-modprobe-d\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-kubernetes\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296750 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296764 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wdlhb\""
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-tuned\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296831 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44nx6\" (UniqueName: \"kubernetes.io/projected/221dffa5-7757-4e46-94b6-caf50b41f29e-kube-api-access-44nx6\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9h69\" (UniqueName: \"kubernetes.io/projected/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-kube-api-access-n9h69\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296891 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdxzb\" (UniqueName: \"kubernetes.io/projected/edbc33b8-02e4-43d1-a683-6dcd726340b7-kube-api-access-bdxzb\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.296928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv482\" (UniqueName: \"kubernetes.io/projected/1e6d3a55-047e-4c45-a923-7fa5ce12912c-kube-api-access-kv482\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297054 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-os-release\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.297439 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297074 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8lpxp\""
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297086 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-netns\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-hostroot\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-host\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297186 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-systemd\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-var-lib-kubelet\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297264 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/221dffa5-7757-4e46-94b6-caf50b41f29e-tmp\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-system-cni-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-cnibin\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d399b0dc-b56e-4c25-8058-11c529fe99f7-cni-binary-copy\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-socket-dir-parent\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297602 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysctl-d\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e6d3a55-047e-4c45-a923-7fa5ce12912c-host\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297662 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-cni-multus\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnc5\" (UniqueName: \"kubernetes.io/projected/d399b0dc-b56e-4c25-8058-11c529fe99f7-kube-api-access-7hnc5\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.298355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysconfig\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.299114 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-lib-modules\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.299114 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bcxh4\""
Apr 24 22:29:55.299114 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-tmp-dir\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl"
Apr 24 22:29:55.299114 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-conf-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.299114 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-daemon-config\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.299114 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.297905 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysctl-conf\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.300433 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.300410 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:29:55.300522 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.300450 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:29:55.300827 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.300640 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tlrrk\""
Apr 24 22:29:55.300827 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.300419 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 22:29:55.301789 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.301771 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 22:29:55.301878 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.301848 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qlvgm\""
Apr 24 22:29:55.302723 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.302702 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.302723 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.302716 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.303702 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.303684 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 22:29:55.304201 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.304186 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 22:29:55.304570 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.304558 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-582p9\""
Apr 24 22:29:55.306557 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.306538 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 22:29:55.306677 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.306659 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 22:29:55.306987 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.306969 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 22:29:55.307745 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.307526 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j87cq\""
Apr 24 22:29:55.307745 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.307569 2572 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-29drs\"" Apr 24 22:29:55.307745 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.307599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:29:55.307745 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.307604 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:29:55.381815 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.381792 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:54 +0000 UTC" deadline="2027-11-25 01:20:23.697910301 +0000 UTC" Apr 24 22:29:55.381815 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.381813 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13898h50m28.316100662s" Apr 24 22:29:55.388312 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.388294 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 22:29:55.398979 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.398960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-lib-modules\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.399090 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.398987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-tmp-dir\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl" Apr 24 
22:29:55.399090 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-conf-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.399090 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-node-log\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.399090 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399056 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6126be90-f084-4237-b144-cdf6cef066c9-ovn-node-metrics-cert\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.399090 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysctl-conf\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-sys\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-conf-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-sys\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-lib-modules\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-cni-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-kubelet\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" 
Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399216 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-cni-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-var-lib-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysctl-conf\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-env-overrides\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-tmp-dir\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl" Apr 24 22:29:55.399328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e6d3a55-047e-4c45-a923-7fa5ce12912c-serviceca\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-systemd-units\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-socket-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-sys-fs\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: 
\"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399434 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2n5d\" (UniqueName: \"kubernetes.io/projected/05911b94-3317-4b23-b34f-f95d4552f61e-kube-api-access-j2n5d\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-cni-binary-copy\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399496 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-run\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-k8s-cni-cncf-io\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-cni-bin\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-log-socket\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-run\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-cni-bin\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-k8s-cni-cncf-io\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-cni-bin\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399635 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs7gt\" (UniqueName: \"kubernetes.io/projected/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-kube-api-access-xs7gt\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e6d3a55-047e-4c45-a923-7fa5ce12912c-serviceca\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.399862 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-modprobe-d\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399732 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-kubernetes\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-tuned\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44nx6\" (UniqueName: \"kubernetes.io/projected/221dffa5-7757-4e46-94b6-caf50b41f29e-kube-api-access-44nx6\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9h69\" (UniqueName: \"kubernetes.io/projected/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-kube-api-access-n9h69\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.399820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-modprobe-d\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:29:55.399830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-kubernetes\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxzb\" (UniqueName: \"kubernetes.io/projected/edbc33b8-02e4-43d1-a683-6dcd726340b7-kube-api-access-bdxzb\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400076 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-ovn\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv482\" (UniqueName: \"kubernetes.io/projected/1e6d3a55-047e-4c45-a923-7fa5ce12912c-kube-api-access-kv482\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400175 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-os-release\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-netns\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05911b94-3317-4b23-b34f-f95d4552f61e-host-slash\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400273 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/987422f7-15db-4df1-8e08-87491e9648dd-agent-certs\") pod \"konnectivity-agent-pqcxg\" (UID: \"987422f7-15db-4df1-8e08-87491e9648dd\") " pod="kube-system/konnectivity-agent-pqcxg" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400300 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-netns\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.400589 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-os-release\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-systemd\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.400402 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400447 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-systemd\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-var-lib-kubelet\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400492 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/221dffa5-7757-4e46-94b6-caf50b41f29e-tmp\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400547 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-var-lib-kubelet\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.400550 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs podName:edbc33b8-02e4-43d1-a683-6dcd726340b7 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:55.90049515 +0000 UTC m=+3.048800695 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs") pod "network-metrics-daemon-v4wwp" (UID: "edbc33b8-02e4-43d1-a683-6dcd726340b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d399b0dc-b56e-4c25-8058-11c529fe99f7-cni-binary-copy\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-socket-dir-parent\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-cni-multus\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-slash\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400692 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-systemd\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400715 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysctl-d\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-cni-multus\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e6d3a55-047e-4c45-a923-7fa5ce12912c-host\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400765 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-run-netns\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.401387 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400775 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-socket-dir-parent\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400817 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-etc-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysctl-d\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e6d3a55-047e-4c45-a923-7fa5ce12912c-host\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.402155 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400923 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-etc-selinux\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysconfig\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.400997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-daemon-config\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401040 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-system-cni-dir\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-multus-certs\") pod \"multus-t9drx\" (UID: 
\"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-cni-netd\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-kubelet\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401171 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d399b0dc-b56e-4c25-8058-11c529fe99f7-cni-binary-copy\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-os-release\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401205 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-var-lib-kubelet\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-sysconfig\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.402155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-host-run-multus-certs\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-hosts-file\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-etc-kubernetes\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-etc-kubernetes\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-hosts-file\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401481 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-registration-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d399b0dc-b56e-4c25-8058-11c529fe99f7-multus-daemon-config\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/987422f7-15db-4df1-8e08-87491e9648dd-konnectivity-ca\") pod \"konnectivity-agent-pqcxg\" (UID: \"987422f7-15db-4df1-8e08-87491e9648dd\") " pod="kube-system/konnectivity-agent-pqcxg" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-ovnkube-config\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-cnibin\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-hostroot\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401695 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401741 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-hostroot\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-ovnkube-script-lib\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4qx6\" (UniqueName: \"kubernetes.io/projected/6126be90-f084-4237-b144-cdf6cef066c9-kube-api-access-r4qx6\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-device-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" 
Apr 24 22:29:55.402750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-host\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.401989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-system-cni-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-cnibin\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402064 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-system-cni-dir\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnc5\" (UniqueName: \"kubernetes.io/projected/d399b0dc-b56e-4c25-8058-11c529fe99f7-kube-api-access-7hnc5\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402126 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d399b0dc-b56e-4c25-8058-11c529fe99f7-cnibin\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/221dffa5-7757-4e46-94b6-caf50b41f29e-host\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/05911b94-3317-4b23-b34f-f95d4552f61e-iptables-alerter-script\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402267 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mwxd\" (UniqueName: \"kubernetes.io/projected/91b6537a-5c00-4286-854b-be48eb427fe2-kube-api-access-9mwxd\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " 
pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402669 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/221dffa5-7757-4e46-94b6-caf50b41f29e-etc-tuned\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.403298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.402788 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/221dffa5-7757-4e46-94b6-caf50b41f29e-tmp\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.421988 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.421952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9h69\" (UniqueName: \"kubernetes.io/projected/e66ee62e-3c52-4dc3-b9b5-336aaebbd397-kube-api-access-n9h69\") pod \"node-resolver-d44dl\" (UID: \"e66ee62e-3c52-4dc3-b9b5-336aaebbd397\") " pod="openshift-dns/node-resolver-d44dl" Apr 24 22:29:55.421988 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.421986 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44nx6\" (UniqueName: \"kubernetes.io/projected/221dffa5-7757-4e46-94b6-caf50b41f29e-kube-api-access-44nx6\") pod \"tuned-4hbpm\" (UID: \"221dffa5-7757-4e46-94b6-caf50b41f29e\") " pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" Apr 24 22:29:55.422307 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.422287 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnc5\" (UniqueName: \"kubernetes.io/projected/d399b0dc-b56e-4c25-8058-11c529fe99f7-kube-api-access-7hnc5\") pod \"multus-t9drx\" (UID: \"d399b0dc-b56e-4c25-8058-11c529fe99f7\") " 
pod="openshift-multus/multus-t9drx" Apr 24 22:29:55.423353 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.423337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdxzb\" (UniqueName: \"kubernetes.io/projected/edbc33b8-02e4-43d1-a683-6dcd726340b7-kube-api-access-bdxzb\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:29:55.423974 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.423942 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv482\" (UniqueName: \"kubernetes.io/projected/1e6d3a55-047e-4c45-a923-7fa5ce12912c-kube-api-access-kv482\") pod \"node-ca-qgt9r\" (UID: \"1e6d3a55-047e-4c45-a923-7fa5ce12912c\") " pod="openshift-image-registry/node-ca-qgt9r" Apr 24 22:29:55.502649 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6126be90-f084-4237-b144-cdf6cef066c9-ovn-node-metrics-cert\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.502784 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-kubelet\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.502784 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-var-lib-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: 
\"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.502784 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.502784 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502727 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-env-overrides\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.502784 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-systemd-units\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.502784 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-socket-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-var-lib-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-sys-fs\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2n5d\" (UniqueName: \"kubernetes.io/projected/05911b94-3317-4b23-b34f-f95d4552f61e-kube-api-access-j2n5d\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502844 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-kubelet\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-sys-fs\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502867 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-cni-binary-copy\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-systemd-units\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.502927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.503081 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-log-socket\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-log-socket\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503044 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-socket-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-cni-bin\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-cni-bin\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs7gt\" (UniqueName: \"kubernetes.io/projected/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-kube-api-access-xs7gt\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj" Apr 24 22:29:55.503478 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:29:55.503240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-ovn\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05911b94-3317-4b23-b34f-f95d4552f61e-host-slash\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/987422f7-15db-4df1-8e08-87491e9648dd-agent-certs\") pod \"konnectivity-agent-pqcxg\" (UID: \"987422f7-15db-4df1-8e08-87491e9648dd\") " pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-env-overrides\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-ovn\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/05911b94-3317-4b23-b34f-f95d4552f61e-host-slash\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503362 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-slash\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-slash\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-systemd\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.503478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503447 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-run-systemd\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-run-netns\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503492 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-etc-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-run-netns\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-etc-selinux\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-system-cni-dir\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-cni-netd\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503635 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-etc-selinux\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-etc-openvswitch\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-cni-netd\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-os-release\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-system-cni-dir\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503749 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-registration-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-os-release\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/987422f7-15db-4df1-8e08-87491e9648dd-konnectivity-ca\") pod \"konnectivity-agent-pqcxg\" (UID: \"987422f7-15db-4df1-8e08-87491e9648dd\") " pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-ovnkube-config\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-registration-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-cnibin\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503872 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-cni-binary-copy\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91b6537a-5c00-4286-854b-be48eb427fe2-cnibin\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.503875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-ovnkube-script-lib\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504113 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4qx6\" (UniqueName: \"kubernetes.io/projected/6126be90-f084-4237-b144-cdf6cef066c9-kube-api-access-r4qx6\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-device-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/05911b94-3317-4b23-b34f-f95d4552f61e-iptables-alerter-script\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mwxd\" (UniqueName: \"kubernetes.io/projected/91b6537a-5c00-4286-854b-be48eb427fe2-kube-api-access-9mwxd\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-node-log\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/987422f7-15db-4df1-8e08-87491e9648dd-konnectivity-ca\") pod \"konnectivity-agent-pqcxg\" (UID: \"987422f7-15db-4df1-8e08-87491e9648dd\") " pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:29:55.504999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-node-log\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6126be90-f084-4237-b144-cdf6cef066c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504386 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-ovnkube-config\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-device-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504438 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6126be90-f084-4237-b144-cdf6cef066c9-ovnkube-script-lib\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504483 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.504902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/05911b94-3317-4b23-b34f-f95d4552f61e-iptables-alerter-script\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.505121 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.505223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/91b6537a-5c00-4286-854b-be48eb427fe2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.505269 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6126be90-f084-4237-b144-cdf6cef066c9-ovn-node-metrics-cert\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.505611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.505549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/987422f7-15db-4df1-8e08-87491e9648dd-agent-certs\") pod \"konnectivity-agent-pqcxg\" (UID: \"987422f7-15db-4df1-8e08-87491e9648dd\") " pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:29:55.518103 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.518043 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:55.518103 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.518071 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:55.518103 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.518085 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mtlkc for pod openshift-network-diagnostics/network-check-target-6q8zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:55.518311 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.518156 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc podName:6a2536b6-b046-4594-8305-33498ed4dadd nodeName:}" failed. No retries permitted until 2026-04-24 22:29:56.018139715 +0000 UTC m=+3.166445242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mtlkc" (UniqueName: "kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc") pod "network-check-target-6q8zg" (UID: "6a2536b6-b046-4594-8305-33498ed4dadd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:55.520217 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.520197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2n5d\" (UniqueName: \"kubernetes.io/projected/05911b94-3317-4b23-b34f-f95d4552f61e-kube-api-access-j2n5d\") pod \"iptables-alerter-rsk2h\" (UID: \"05911b94-3317-4b23-b34f-f95d4552f61e\") " pod="openshift-network-operator/iptables-alerter-rsk2h"
Apr 24 22:29:55.520492 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.520472 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4qx6\" (UniqueName: \"kubernetes.io/projected/6126be90-f084-4237-b144-cdf6cef066c9-kube-api-access-r4qx6\") pod \"ovnkube-node-x9trm\" (UID: \"6126be90-f084-4237-b144-cdf6cef066c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.520814 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.520794 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs7gt\" (UniqueName: \"kubernetes.io/projected/b463ac55-0ee6-47ff-891e-bc4bf5b7184b-kube-api-access-xs7gt\") pod \"aws-ebs-csi-driver-node-8frbj\" (UID: \"b463ac55-0ee6-47ff-891e-bc4bf5b7184b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.521030 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.520995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mwxd\" (UniqueName: \"kubernetes.io/projected/91b6537a-5c00-4286-854b-be48eb427fe2-kube-api-access-9mwxd\") pod \"multus-additional-cni-plugins-l57fj\" (UID: \"91b6537a-5c00-4286-854b-be48eb427fe2\") " pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.594701 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.594677 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4hbpm"
Apr 24 22:29:55.602210 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.602191 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d44dl"
Apr 24 22:29:55.611876 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.611855 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qgt9r"
Apr 24 22:29:55.615499 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.615481 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t9drx"
Apr 24 22:29:55.623157 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.623140 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:29:55.629709 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.629691 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rsk2h"
Apr 24 22:29:55.636122 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.636100 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:29:55.641602 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.641584 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj"
Apr 24 22:29:55.647086 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.647070 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l57fj"
Apr 24 22:29:55.907925 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:55.907864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:29:55.908065 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.907980 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:55.908065 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:55.908044 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs podName:edbc33b8-02e4-43d1-a683-6dcd726340b7 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:56.908027718 +0000 UTC m=+4.056333250 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs") pod "network-metrics-daemon-v4wwp" (UID: "edbc33b8-02e4-43d1-a683-6dcd726340b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:55.960634 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:55.960418 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb463ac55_0ee6_47ff_891e_bc4bf5b7184b.slice/crio-e00137e1b3078baab3657753737422e989a424b7d96a737c6d66514c9703f414 WatchSource:0}: Error finding container e00137e1b3078baab3657753737422e989a424b7d96a737c6d66514c9703f414: Status 404 returned error can't find the container with id e00137e1b3078baab3657753737422e989a424b7d96a737c6d66514c9703f414
Apr 24 22:29:55.961739 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:55.961717 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6126be90_f084_4237_b144_cdf6cef066c9.slice/crio-455c45ea552acec893ab36f0de3933387bdbfb57d97a326f0e3590ecec25eaf8 WatchSource:0}: Error finding container 455c45ea552acec893ab36f0de3933387bdbfb57d97a326f0e3590ecec25eaf8: Status 404 returned error can't find the container with id 455c45ea552acec893ab36f0de3933387bdbfb57d97a326f0e3590ecec25eaf8
Apr 24 22:29:55.964348 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:55.964325 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd399b0dc_b56e_4c25_8058_11c529fe99f7.slice/crio-df4cc23d6b004f1605b92dca6d5c905f8fb5140dada81df33f9a64cbf5066dfd WatchSource:0}: Error finding container df4cc23d6b004f1605b92dca6d5c905f8fb5140dada81df33f9a64cbf5066dfd: Status 404 returned error can't find the container with id df4cc23d6b004f1605b92dca6d5c905f8fb5140dada81df33f9a64cbf5066dfd
Apr 24 22:29:55.967381 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:55.967358 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05911b94_3317_4b23_b34f_f95d4552f61e.slice/crio-98a628234838cd13829deadac3c487c7ce995066753eea17b42682d790b75073 WatchSource:0}: Error finding container 98a628234838cd13829deadac3c487c7ce995066753eea17b42682d790b75073: Status 404 returned error can't find the container with id 98a628234838cd13829deadac3c487c7ce995066753eea17b42682d790b75073
Apr 24 22:29:55.968198 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:55.968166 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221dffa5_7757_4e46_94b6_caf50b41f29e.slice/crio-bebc70dc64738715cd85c07a4e445fde7383778b14013c7a79fe734b7e1e7703 WatchSource:0}: Error finding container bebc70dc64738715cd85c07a4e445fde7383778b14013c7a79fe734b7e1e7703: Status 404 returned error can't find the container with id bebc70dc64738715cd85c07a4e445fde7383778b14013c7a79fe734b7e1e7703
Apr 24 22:29:55.968957 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:55.968929 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e6d3a55_047e_4c45_a923_7fa5ce12912c.slice/crio-9b850860986f61e742990a435912a4fef95c3561768af54e91ffce5e240791ce WatchSource:0}: Error finding container 9b850860986f61e742990a435912a4fef95c3561768af54e91ffce5e240791ce: Status 404 returned error can't find the container with id 9b850860986f61e742990a435912a4fef95c3561768af54e91ffce5e240791ce
Apr 24 22:29:55.969896 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:55.969812 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987422f7_15db_4df1_8e08_87491e9648dd.slice/crio-b87fe10be874a18bb2679fda1902ef0f266bbb7cd97fbf9715102fe668361dc3 WatchSource:0}: Error finding container b87fe10be874a18bb2679fda1902ef0f266bbb7cd97fbf9715102fe668361dc3: Status 404 returned error can't find the container with id b87fe10be874a18bb2679fda1902ef0f266bbb7cd97fbf9715102fe668361dc3
Apr 24 22:29:55.970659 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:29:55.970634 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b6537a_5c00_4286_854b_be48eb427fe2.slice/crio-f08b5a1a64fe335b2afd2463cd5795801870ae43e0285f66f9e276668433439d WatchSource:0}: Error finding container f08b5a1a64fe335b2afd2463cd5795801870ae43e0285f66f9e276668433439d: Status 404 returned error can't find the container with id f08b5a1a64fe335b2afd2463cd5795801870ae43e0285f66f9e276668433439d
Apr 24 22:29:56.109300 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.109278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:29:56.109405 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:56.109393 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:56.109459 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:56.109408 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:56.109459 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:56.109417 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mtlkc for pod openshift-network-diagnostics/network-check-target-6q8zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:56.109459 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:56.109453 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc podName:6a2536b6-b046-4594-8305-33498ed4dadd nodeName:}" failed. No retries permitted until 2026-04-24 22:29:57.109441694 +0000 UTC m=+4.257747225 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mtlkc" (UniqueName: "kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc") pod "network-check-target-6q8zg" (UID: "6a2536b6-b046-4594-8305-33498ed4dadd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:56.382781 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.382674 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:54 +0000 UTC" deadline="2028-01-09 11:43:48.62769768 +0000 UTC"
Apr 24 22:29:56.382781 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.382710 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14989h13m52.244991801s"
Apr 24 22:29:56.426456 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.426418 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"455c45ea552acec893ab36f0de3933387bdbfb57d97a326f0e3590ecec25eaf8"}
Apr 24 22:29:56.427836 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.427773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d44dl" event={"ID":"e66ee62e-3c52-4dc3-b9b5-336aaebbd397","Type":"ContainerStarted","Data":"cdb8cac8971cdbe45bff6395cef1420f18dc9443e8e0a9e6af0869929c3444d1"}
Apr 24 22:29:56.432150 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.432110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qgt9r" event={"ID":"1e6d3a55-047e-4c45-a923-7fa5ce12912c","Type":"ContainerStarted","Data":"9b850860986f61e742990a435912a4fef95c3561768af54e91ffce5e240791ce"}
Apr 24 22:29:56.433871 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.433819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" event={"ID":"221dffa5-7757-4e46-94b6-caf50b41f29e","Type":"ContainerStarted","Data":"bebc70dc64738715cd85c07a4e445fde7383778b14013c7a79fe734b7e1e7703"}
Apr 24 22:29:56.436130 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.436074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9drx" event={"ID":"d399b0dc-b56e-4c25-8058-11c529fe99f7","Type":"ContainerStarted","Data":"df4cc23d6b004f1605b92dca6d5c905f8fb5140dada81df33f9a64cbf5066dfd"}
Apr 24 22:29:56.440132 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.440093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" event={"ID":"b463ac55-0ee6-47ff-891e-bc4bf5b7184b","Type":"ContainerStarted","Data":"e00137e1b3078baab3657753737422e989a424b7d96a737c6d66514c9703f414"}
Apr 24 22:29:56.450070 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.450044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal" event={"ID":"02aaeaca2e7902be6bed83a0bfe87afe","Type":"ContainerStarted","Data":"ceccf102488bf99fa5eb55f313233cb6e922a9f18689992175b982ac7a524e0b"}
Apr 24 22:29:56.458216 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.458184 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l57fj" event={"ID":"91b6537a-5c00-4286-854b-be48eb427fe2","Type":"ContainerStarted","Data":"f08b5a1a64fe335b2afd2463cd5795801870ae43e0285f66f9e276668433439d"}
Apr 24 22:29:56.461233 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.461212 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pqcxg" event={"ID":"987422f7-15db-4df1-8e08-87491e9648dd","Type":"ContainerStarted","Data":"b87fe10be874a18bb2679fda1902ef0f266bbb7cd97fbf9715102fe668361dc3"}
Apr 24 22:29:56.462863 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.462839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rsk2h" event={"ID":"05911b94-3317-4b23-b34f-f95d4552f61e","Type":"ContainerStarted","Data":"98a628234838cd13829deadac3c487c7ce995066753eea17b42682d790b75073"}
Apr 24 22:29:56.916526 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:56.916495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:29:56.916650 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:56.916632 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:56.916711 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:56.916688 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs podName:edbc33b8-02e4-43d1-a683-6dcd726340b7 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.916669664 +0000 UTC m=+6.064975205 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs") pod "network-metrics-daemon-v4wwp" (UID: "edbc33b8-02e4-43d1-a683-6dcd726340b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:57.118706 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:57.118627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:29:57.118826 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:57.118802 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:57.118826 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:57.118821 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:57.118934 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:57.118832 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mtlkc for pod openshift-network-diagnostics/network-check-target-6q8zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:57.118934 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:57.118884 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc podName:6a2536b6-b046-4594-8305-33498ed4dadd nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:59.118867311 +0000 UTC m=+6.267172839 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mtlkc" (UniqueName: "kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc") pod "network-check-target-6q8zg" (UID: "6a2536b6-b046-4594-8305-33498ed4dadd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:57.414709 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:57.414146 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:29:57.414709 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:57.414264 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:29:57.415457 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:57.415308 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:29:57.415457 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:57.415414 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:29:57.488043 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:57.487753 2572 generic.go:358] "Generic (PLEG): container finished" podID="967fd0e774275622d49edbbb2962b348" containerID="aa8842df76efec03d18b83ffbdb673d9c734f4dae6182312c6050861ee824aed" exitCode=0 Apr 24 22:29:57.488043 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:57.487853 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal" event={"ID":"967fd0e774275622d49edbbb2962b348","Type":"ContainerDied","Data":"aa8842df76efec03d18b83ffbdb673d9c734f4dae6182312c6050861ee824aed"} Apr 24 22:29:57.509947 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:57.509254 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-103.ec2.internal" podStartSLOduration=3.509237753 podStartE2EDuration="3.509237753s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:56.47589168 +0000 UTC m=+3.624197231" watchObservedRunningTime="2026-04-24 22:29:57.509237753 +0000 UTC m=+4.657543326" Apr 24 22:29:58.496946 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:58.496888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal" event={"ID":"967fd0e774275622d49edbbb2962b348","Type":"ContainerStarted","Data":"ac553808c5b76f5f5f70c70ee9e816f55887cd61a2fd0f935ca07b0cd45b633a"} Apr 24 22:29:58.931391 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:58.930754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod 
\"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:29:58.931391 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:58.930939 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:58.931391 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:58.930995 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs podName:edbc33b8-02e4-43d1-a683-6dcd726340b7 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:02.930978204 +0000 UTC m=+10.079283735 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs") pod "network-metrics-daemon-v4wwp" (UID: "edbc33b8-02e4-43d1-a683-6dcd726340b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:59.132690 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:59.132598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:29:59.132878 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:59.132777 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:59.132878 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:59.132797 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 24 22:29:59.132878 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:59.132811 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mtlkc for pod openshift-network-diagnostics/network-check-target-6q8zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:59.132878 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:59.132870 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc podName:6a2536b6-b046-4594-8305-33498ed4dadd nodeName:}" failed. No retries permitted until 2026-04-24 22:30:03.132850839 +0000 UTC m=+10.281156382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mtlkc" (UniqueName: "kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc") pod "network-check-target-6q8zg" (UID: "6a2536b6-b046-4594-8305-33498ed4dadd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:59.416044 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:59.414086 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:29:59.416044 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:59.414208 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:29:59.416044 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:29:59.414700 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:29:59.416044 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:29:59.414803 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:01.413505 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:01.413436 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:01.413937 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:01.413566 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:01.413937 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:01.413901 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:01.414082 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:01.413989 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:02.967177 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:02.967082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:02.967607 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:02.967209 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:02.967607 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:02.967285 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs podName:edbc33b8-02e4-43d1-a683-6dcd726340b7 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:10.967262868 +0000 UTC m=+18.115568397 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs") pod "network-metrics-daemon-v4wwp" (UID: "edbc33b8-02e4-43d1-a683-6dcd726340b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:03.168627 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:03.168593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:03.169226 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:03.168783 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:03.169226 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:03.168810 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:03.169226 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:03.168824 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mtlkc for pod openshift-network-diagnostics/network-check-target-6q8zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:03.169226 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:03.168884 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc podName:6a2536b6-b046-4594-8305-33498ed4dadd nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:11.16886553 +0000 UTC m=+18.317171061 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mtlkc" (UniqueName: "kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc") pod "network-check-target-6q8zg" (UID: "6a2536b6-b046-4594-8305-33498ed4dadd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:03.414830 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:03.414143 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:03.414830 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:03.414254 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:03.414830 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:03.414637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:03.414830 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:03.414740 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:05.413336 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:05.413304 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:05.413779 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:05.413314 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:05.413779 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:05.413415 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:05.413779 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:05.413521 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:07.413302 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:07.413273 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:07.413775 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:07.413310 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:07.413775 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:07.413387 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:07.413775 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:07.413478 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:09.413951 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:09.413917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:09.414464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:09.413917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:09.414464 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:09.414052 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:09.414464 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:09.414146 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:11.033627 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:11.033594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:11.034126 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:11.033715 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:11.034126 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:11.033769 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs podName:edbc33b8-02e4-43d1-a683-6dcd726340b7 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:27.03375514 +0000 UTC m=+34.182060666 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs") pod "network-metrics-daemon-v4wwp" (UID: "edbc33b8-02e4-43d1-a683-6dcd726340b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:11.235683 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:11.235640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:11.235839 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:11.235799 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:11.235839 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:11.235821 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:11.235839 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:11.235831 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mtlkc for pod openshift-network-diagnostics/network-check-target-6q8zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:11.235971 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:11.235879 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc podName:6a2536b6-b046-4594-8305-33498ed4dadd nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:27.235864857 +0000 UTC m=+34.384170382 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mtlkc" (UniqueName: "kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc") pod "network-check-target-6q8zg" (UID: "6a2536b6-b046-4594-8305-33498ed4dadd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:11.413491 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:11.413423 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:11.413614 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:11.413546 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:11.413614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:11.413592 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:11.413713 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:11.413679 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:13.414311 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.414149 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:13.415074 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:13.414355 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:13.415074 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.414218 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:13.415074 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:13.414453 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:13.525147 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.524978 2572 generic.go:358] "Generic (PLEG): container finished" podID="91b6537a-5c00-4286-854b-be48eb427fe2" containerID="6eb845cb272005a8a8c7a5628956218cb43596c2ea5cfe68a7767ca99e83265f" exitCode=0 Apr 24 22:30:13.525257 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.525045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l57fj" event={"ID":"91b6537a-5c00-4286-854b-be48eb427fe2","Type":"ContainerDied","Data":"6eb845cb272005a8a8c7a5628956218cb43596c2ea5cfe68a7767ca99e83265f"} Apr 24 22:30:13.527266 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.527230 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pqcxg" event={"ID":"987422f7-15db-4df1-8e08-87491e9648dd","Type":"ContainerStarted","Data":"a017c4f03e90ff4673d5202c8bb56de8cc1137b870f237342133847953663af7"} Apr 24 22:30:13.529841 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.529815 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"61749ad3435bb6c03a6faf56be4f89c12bbce510f12d02bac93dc17027235adf"} Apr 24 22:30:13.529931 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.529844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"81cb92e7c12a859346d6153ae2bba62e1d9142fff5761d69e8c41827b68b3782"} Apr 24 22:30:13.531410 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.531383 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d44dl" 
event={"ID":"e66ee62e-3c52-4dc3-b9b5-336aaebbd397","Type":"ContainerStarted","Data":"170b2e60f8279970705ea58f50741bacefe491fe7ce7702f6204a29d9e5269fa"}
Apr 24 22:30:13.533145 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.533123 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qgt9r" event={"ID":"1e6d3a55-047e-4c45-a923-7fa5ce12912c","Type":"ContainerStarted","Data":"9aeeabe82e1f3abdb956d43d8ac0092ad981b3d493a0554ed186954da28066ee"}
Apr 24 22:30:13.534486 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.534459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" event={"ID":"221dffa5-7757-4e46-94b6-caf50b41f29e","Type":"ContainerStarted","Data":"2ac1418ab633f8d96802b36507e4db93014d28e0d7080b2e6bc4040ba89cfe18"}
Apr 24 22:30:13.535916 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.535879 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9drx" event={"ID":"d399b0dc-b56e-4c25-8058-11c529fe99f7","Type":"ContainerStarted","Data":"f4417204e4aebf2ea1191267dad605e509340aab6bdb2e3560d74c9dbdad714c"}
Apr 24 22:30:13.537923 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.537567 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" event={"ID":"b463ac55-0ee6-47ff-891e-bc4bf5b7184b","Type":"ContainerStarted","Data":"b77b15bb8927e945e07d40d2ba4f8929263bbc4870cd755b5f8801844f5fd473"}
Apr 24 22:30:13.593303 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.593254 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-103.ec2.internal" podStartSLOduration=19.593237333 podStartE2EDuration="19.593237333s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC"
observedRunningTime="2026-04-24 22:29:58.519659835 +0000 UTC m=+5.667965384" watchObservedRunningTime="2026-04-24 22:30:13.593237333 +0000 UTC m=+20.741542883"
Apr 24 22:30:13.653774 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.653723 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4hbpm" podStartSLOduration=3.882823566 podStartE2EDuration="20.653705292s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.969830046 +0000 UTC m=+3.118135578" lastFinishedPulling="2026-04-24 22:30:12.740711772 +0000 UTC m=+19.889017304" observedRunningTime="2026-04-24 22:30:13.652949716 +0000 UTC m=+20.801255267" watchObservedRunningTime="2026-04-24 22:30:13.653705292 +0000 UTC m=+20.802010841"
Apr 24 22:30:13.750162 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.750056 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qgt9r" podStartSLOduration=3.955941576 podStartE2EDuration="20.750042514s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.971898425 +0000 UTC m=+3.120203952" lastFinishedPulling="2026-04-24 22:30:12.765999349 +0000 UTC m=+19.914304890" observedRunningTime="2026-04-24 22:30:13.714696849 +0000 UTC m=+20.863002407" watchObservedRunningTime="2026-04-24 22:30:13.750042514 +0000 UTC m=+20.898348062"
Apr 24 22:30:13.814646 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.814610 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d44dl" podStartSLOduration=4.048536354 podStartE2EDuration="20.814598085s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.974646 +0000 UTC m=+3.122951527" lastFinishedPulling="2026-04-24 22:30:12.740707717 +0000 UTC m=+19.889013258" observedRunningTime="2026-04-24 22:30:13.750445914 +0000 UTC m=+20.898751462"
watchObservedRunningTime="2026-04-24 22:30:13.814598085 +0000 UTC m=+20.962903633"
Apr 24 22:30:13.814722 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.814703 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t9drx" podStartSLOduration=3.996189912 podStartE2EDuration="20.814699813s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.96637626 +0000 UTC m=+3.114681785" lastFinishedPulling="2026-04-24 22:30:12.784886157 +0000 UTC m=+19.933191686" observedRunningTime="2026-04-24 22:30:13.812904395 +0000 UTC m=+20.961209942" watchObservedRunningTime="2026-04-24 22:30:13.814699813 +0000 UTC m=+20.963005376"
Apr 24 22:30:13.851418 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.851373 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pqcxg" podStartSLOduration=4.083511233 podStartE2EDuration="20.851356321s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.973109376 +0000 UTC m=+3.121414907" lastFinishedPulling="2026-04-24 22:30:12.740954468 +0000 UTC m=+19.889259995" observedRunningTime="2026-04-24 22:30:13.849530475 +0000 UTC m=+20.997836024" watchObservedRunningTime="2026-04-24 22:30:13.851356321 +0000 UTC m=+20.999661871"
Apr 24 22:30:13.954356 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:13.954307 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 22:30:14.367842 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.367692 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:30:13.954328096Z","UUID":"eb008c7d-8984-441f-9f68-7e7cfb44c00e","Handler":null,"Name":"","Endpoint":""}
Apr 24 22:30:14.369992 ip-10-0-137-103
kubenswrapper[2572]: I0424 22:30:14.369969 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 22:30:14.370143 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.370031 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 22:30:14.541773 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.541731 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" event={"ID":"b463ac55-0ee6-47ff-891e-bc4bf5b7184b","Type":"ContainerStarted","Data":"87cc23b7edec067ff47df49b12d0d1d851958482eed4b56d73109cabdda7b47d"}
Apr 24 22:30:14.543535 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.543505 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rsk2h" event={"ID":"05911b94-3317-4b23-b34f-f95d4552f61e","Type":"ContainerStarted","Data":"285b7f3c2d98b4e383307549a1266d9a7d19bd7c9cb3175b0ce55f3fc24b9f98"}
Apr 24 22:30:14.546659 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.546632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"fdadf407c1bd737fc0db6711dc2feb434db7ddcc9b2e24210326390c50886dbd"}
Apr 24 22:30:14.546659 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.546666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"5b83820d528d5ba653b157bc8b679c335a06098b2a7aec83637f97c05c7db1db"}
Apr 24 22:30:14.546823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.546681 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"24e455d53879328044d50f22e6718155afab11b5ef161964f2c2ba7b61547ebe"}
Apr 24 22:30:14.546823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.546694 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"e6e454bb477bf4f94f62db101b599f7220d4f67a279a8293a6d0af9f19dc30fc"}
Apr 24 22:30:14.563385 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.563341 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rsk2h" podStartSLOduration=4.762477643 podStartE2EDuration="21.563327523s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.969245343 +0000 UTC m=+3.117550870" lastFinishedPulling="2026-04-24 22:30:12.77009521 +0000 UTC m=+19.918400750" observedRunningTime="2026-04-24 22:30:14.563271017 +0000 UTC m=+21.711576589" watchObservedRunningTime="2026-04-24 22:30:14.563327523 +0000 UTC m=+21.711633071"
Apr 24 22:30:14.953554 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.953500 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:30:14.954146 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:14.954126 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:30:15.413860 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:15.413651 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:15.414146 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:15.413718 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:15.414146 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:15.413925 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:30:15.414146 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:15.414026 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:30:15.550379 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:15.550349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" event={"ID":"b463ac55-0ee6-47ff-891e-bc4bf5b7184b","Type":"ContainerStarted","Data":"6a75292f0c584c7e02e1b4a70a58e1fb75c9c74e81a4caaf3445fddfbed48869"}
Apr 24 22:30:15.550807 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:15.550524 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:30:15.550983 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:15.550966 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pqcxg"
Apr 24 22:30:15.600561 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:15.600521 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8frbj" podStartSLOduration=3.779096531 podStartE2EDuration="22.600511321s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.962091401 +0000 UTC m=+3.110396927" lastFinishedPulling="2026-04-24 22:30:14.783506191 +0000 UTC m=+21.931811717" observedRunningTime="2026-04-24 22:30:15.600279164 +0000 UTC m=+22.748584712" watchObservedRunningTime="2026-04-24 22:30:15.600511321 +0000 UTC m=+22.748816868"
Apr 24 22:30:15.856105 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:15.856034 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d44dl_e66ee62e-3c52-4dc3-b9b5-336aaebbd397/dns-node-resolver/0.log"
Apr 24 22:30:17.022881 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:17.022851 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qgt9r_1e6d3a55-047e-4c45-a923-7fa5ce12912c/node-ca/0.log"
Apr 24 22:30:17.415905 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:17.415848 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:17.416063 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:17.415848 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:17.416063 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:17.415933 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:30:17.416063 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:17.416026 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:30:17.557661 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:17.557630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"3da88d953e0a180cafb75516bc69c9ed8fff80f0098a9cc3be5656dfc5a5f6b0"}
Apr 24 22:30:18.561560 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:18.561517 2572 generic.go:358] "Generic (PLEG): container finished" podID="91b6537a-5c00-4286-854b-be48eb427fe2" containerID="b8186d5c632286861af6111aa941957698e09990bb19c8cd7ce2ab78186b53b4" exitCode=0
Apr 24 22:30:18.562280 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:18.561612 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l57fj" event={"ID":"91b6537a-5c00-4286-854b-be48eb427fe2","Type":"ContainerDied","Data":"b8186d5c632286861af6111aa941957698e09990bb19c8cd7ce2ab78186b53b4"}
Apr 24 22:30:19.416377 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:19.416317 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:19.416480 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:19.416317 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:19.416480 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:19.416446 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:30:19.416480 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:19.416471 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:30:19.564791 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:19.564684 2572 generic.go:358] "Generic (PLEG): container finished" podID="91b6537a-5c00-4286-854b-be48eb427fe2" containerID="56e5937bfdc7f5439e181ebb71cf0667ef1fa32954563561df7128368de18d2b" exitCode=0
Apr 24 22:30:19.564791 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:19.564770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l57fj" event={"ID":"91b6537a-5c00-4286-854b-be48eb427fe2","Type":"ContainerDied","Data":"56e5937bfdc7f5439e181ebb71cf0667ef1fa32954563561df7128368de18d2b"}
Apr 24 22:30:20.568482 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:20.568407 2572 generic.go:358] "Generic (PLEG): container finished" podID="91b6537a-5c00-4286-854b-be48eb427fe2" containerID="6729825ca1ea0964892422076a581c6add5c1f3433b09a56fd7722f2edfa5fa6" exitCode=0
Apr 24 22:30:20.568840 ip-10-0-137-103
kubenswrapper[2572]: I0424 22:30:20.568493 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l57fj" event={"ID":"91b6537a-5c00-4286-854b-be48eb427fe2","Type":"ContainerDied","Data":"6729825ca1ea0964892422076a581c6add5c1f3433b09a56fd7722f2edfa5fa6"}
Apr 24 22:30:20.572070 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:20.572045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" event={"ID":"6126be90-f084-4237-b144-cdf6cef066c9","Type":"ContainerStarted","Data":"8bd6e29160e958392fde3709463aef9aa102d400dd80148e04aefb5f27349ca3"}
Apr 24 22:30:20.572371 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:20.572351 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:30:20.572453 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:20.572380 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:30:20.572453 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:20.572393 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:30:20.587112 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:20.587088 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:30:20.587331 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:20.587316 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm"
Apr 24 22:30:21.416440 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:21.416407 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:21.416583 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:21.416407 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:21.416583 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:21.416497 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:30:21.416683 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:21.416588 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:30:23.416124 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:23.416091 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:23.416514 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:23.416091 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:23.416514 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:23.416201 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:30:23.416514 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:23.416273 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:30:25.413703 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:25.413675 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:25.414133 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:25.413769 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:30:25.414133 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:25.413827 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:25.414133 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:25.413943 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:30:27.047041 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:27.046998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:27.047375 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:27.047159 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:27.047375 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:27.047225 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs podName:edbc33b8-02e4-43d1-a683-6dcd726340b7 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:59.047204756 +0000 UTC m=+66.195510281 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs") pod "network-metrics-daemon-v4wwp" (UID: "edbc33b8-02e4-43d1-a683-6dcd726340b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:27.248560 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:27.248529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:27.248702 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:27.248634 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:27.248702 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:27.248649 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:27.248702 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:27.248659 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mtlkc for pod openshift-network-diagnostics/network-check-target-6q8zg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:27.248803 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:27.248705 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc podName:6a2536b6-b046-4594-8305-33498ed4dadd nodeName:}" failed.
No retries permitted until 2026-04-24 22:30:59.248691426 +0000 UTC m=+66.396996951 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mtlkc" (UniqueName: "kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc") pod "network-check-target-6q8zg" (UID: "6a2536b6-b046-4594-8305-33498ed4dadd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:27.414155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:27.414101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:27.414155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:27.414153 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:27.414276 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:27.414242 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:30:27.414461 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:27.414425 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:30:27.592670 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:27.592640 2572 generic.go:358] "Generic (PLEG): container finished" podID="91b6537a-5c00-4286-854b-be48eb427fe2" containerID="ab3fa6d8744cf46ca750256d5454ce86fa109642b421b29f16100f72630cc88d" exitCode=0
Apr 24 22:30:27.592758 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:27.592672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l57fj" event={"ID":"91b6537a-5c00-4286-854b-be48eb427fe2","Type":"ContainerDied","Data":"ab3fa6d8744cf46ca750256d5454ce86fa109642b421b29f16100f72630cc88d"}
Apr 24 22:30:27.622502 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:27.622458 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" podStartSLOduration=17.311067259 podStartE2EDuration="34.622446364s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.963286942 +0000 UTC m=+3.111592468" lastFinishedPulling="2026-04-24 22:30:13.274666044 +0000 UTC m=+20.422971573" observedRunningTime="2026-04-24 22:30:20.669314424 +0000 UTC m=+27.817619972" watchObservedRunningTime="2026-04-24 22:30:27.622446364 +0000 UTC m=+34.770751912"
Apr 24 22:30:28.597992 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:28.597961 2572 generic.go:358] "Generic (PLEG): container finished" podID="91b6537a-5c00-4286-854b-be48eb427fe2" containerID="af2fc717a7bc7a2ccf8c2fc81f0d741bbfbc7955e2b99236b842a02f685cbf6e" exitCode=0
Apr 24 22:30:28.598377 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:28.598049 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l57fj" event={"ID":"91b6537a-5c00-4286-854b-be48eb427fe2","Type":"ContainerDied","Data":"af2fc717a7bc7a2ccf8c2fc81f0d741bbfbc7955e2b99236b842a02f685cbf6e"}
Apr 24
22:30:29.413629 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:29.413427 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:29.413761 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:29.413446 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:29.413761 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:29.413701 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd"
Apr 24 22:30:29.413840 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:29.413806 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7"
Apr 24 22:30:29.603204 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:29.603177 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l57fj" event={"ID":"91b6537a-5c00-4286-854b-be48eb427fe2","Type":"ContainerStarted","Data":"e10abea7a019fb6d6703d5b37182a42b4164cd7a82f8a22916721868cccc5e2f"}
Apr 24 22:30:29.635616 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:29.635570 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l57fj" podStartSLOduration=6.1746365149999995 podStartE2EDuration="36.635559673s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:29:55.976595441 +0000 UTC m=+3.124900971" lastFinishedPulling="2026-04-24 22:30:26.437518603 +0000 UTC m=+33.585824129" observedRunningTime="2026-04-24 22:30:29.635288835 +0000 UTC m=+36.783594384" watchObservedRunningTime="2026-04-24 22:30:29.635559673 +0000 UTC m=+36.783865221"
Apr 24 22:30:31.413890 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:31.413855 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:30:31.413890 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:31.413881 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp"
Apr 24 22:30:31.414343 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:31.413956 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:31.414343 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:31.414132 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:33.415447 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:33.414454 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:33.415447 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:33.414872 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:33.415447 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:33.415007 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:33.416210 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:33.414612 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:35.416577 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:35.416549 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:35.416971 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:35.416551 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:35.416971 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:35.416657 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:35.416971 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:35.416719 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:37.413617 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:37.413586 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:37.414050 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:37.413586 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:37.414050 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:37.413689 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:37.414050 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:37.413764 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:38.054690 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:38.054664 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6q8zg"] Apr 24 22:30:38.054814 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:38.054777 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:38.054874 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:38.054853 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:38.057714 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:38.057657 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v4wwp"] Apr 24 22:30:38.057829 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:38.057779 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:38.057889 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:38.057870 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:39.413804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:39.413772 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:39.414678 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:39.413883 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:40.413941 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:40.413913 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:40.414268 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:40.414008 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v4wwp" podUID="edbc33b8-02e4-43d1-a683-6dcd726340b7" Apr 24 22:30:41.413968 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.413932 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:41.414371 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:30:41.414054 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6q8zg" podUID="6a2536b6-b046-4594-8305-33498ed4dadd" Apr 24 22:30:41.635274 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.635253 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-103.ec2.internal" event="NodeReady" Apr 24 22:30:41.635367 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.635338 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 22:30:41.724837 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.724784 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vv7c4"] Apr 24 22:30:41.752489 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.752470 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vv7c4"] Apr 24 22:30:41.752598 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.752562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.755494 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.755474 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 22:30:41.755712 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.755689 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 22:30:41.755856 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.755832 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qwslk\"" Apr 24 22:30:41.764949 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.764929 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c9c2z"] Apr 24 22:30:41.791437 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.791410 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-c9c2z"] Apr 24 22:30:41.791552 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.791534 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:41.797345 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.797332 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 22:30:41.797523 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.797508 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 22:30:41.797599 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.797573 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 22:30:41.798489 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.798463 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qbjm6\"" Apr 24 22:30:41.798769 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.798755 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 22:30:41.825526 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.825510 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8swwq"] Apr 24 22:30:41.846628 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.846604 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8swwq"] Apr 24 22:30:41.846700 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.846678 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8swwq" Apr 24 22:30:41.850286 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.850265 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 22:30:41.850421 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.850388 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 22:30:41.850501 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.850433 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x6wvt\"" Apr 24 22:30:41.850800 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.850779 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 22:30:41.860226 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.860203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bbc078c-a636-438c-a64a-5bdfc1d13816-config-volume\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.860298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.860227 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bs8d\" (UniqueName: \"kubernetes.io/projected/1bbc078c-a636-438c-a64a-5bdfc1d13816-kube-api-access-4bs8d\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.860298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.860244 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/1bbc078c-a636-438c-a64a-5bdfc1d13816-tmp-dir\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.860298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.860263 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bbc078c-a636-438c-a64a-5bdfc1d13816-metrics-tls\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.961510 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961487 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:41.961592 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvhzn\" (UniqueName: \"kubernetes.io/projected/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-kube-api-access-hvhzn\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:41.961592 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bbc078c-a636-438c-a64a-5bdfc1d13816-config-volume\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.961592 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961572 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4bs8d\" (UniqueName: \"kubernetes.io/projected/1bbc078c-a636-438c-a64a-5bdfc1d13816-kube-api-access-4bs8d\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.961592 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961589 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-crio-socket\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:41.961760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-data-volume\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:41.961760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961623 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:41.961760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bbc078c-a636-438c-a64a-5bdfc1d13816-tmp-dir\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " 
pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.961760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961662 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bbc078c-a636-438c-a64a-5bdfc1d13816-metrics-tls\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.961760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d1b586a-9b17-4bf0-9aae-512fce173232-cert\") pod \"ingress-canary-8swwq\" (UID: \"5d1b586a-9b17-4bf0-9aae-512fce173232\") " pod="openshift-ingress-canary/ingress-canary-8swwq" Apr 24 22:30:41.961955 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lj6c\" (UniqueName: \"kubernetes.io/projected/5d1b586a-9b17-4bf0-9aae-512fce173232-kube-api-access-5lj6c\") pod \"ingress-canary-8swwq\" (UID: \"5d1b586a-9b17-4bf0-9aae-512fce173232\") " pod="openshift-ingress-canary/ingress-canary-8swwq" Apr 24 22:30:41.961955 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.961931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bbc078c-a636-438c-a64a-5bdfc1d13816-tmp-dir\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.962263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.962109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bbc078c-a636-438c-a64a-5bdfc1d13816-config-volume\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " 
pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.965926 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.965902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bbc078c-a636-438c-a64a-5bdfc1d13816-metrics-tls\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:41.973758 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:41.973736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bs8d\" (UniqueName: \"kubernetes.io/projected/1bbc078c-a636-438c-a64a-5bdfc1d13816-kube-api-access-4bs8d\") pod \"dns-default-vv7c4\" (UID: \"1bbc078c-a636-438c-a64a-5bdfc1d13816\") " pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:42.061643 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.061621 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:42.061999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.061980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.062103 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062028 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvhzn\" (UniqueName: \"kubernetes.io/projected/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-kube-api-access-hvhzn\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.062103 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062072 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-crio-socket\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.062103 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-data-volume\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.062255 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.062408 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-crio-socket\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.062408 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d1b586a-9b17-4bf0-9aae-512fce173232-cert\") pod \"ingress-canary-8swwq\" (UID: \"5d1b586a-9b17-4bf0-9aae-512fce173232\") " pod="openshift-ingress-canary/ingress-canary-8swwq" Apr 24 22:30:42.062408 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lj6c\" (UniqueName: \"kubernetes.io/projected/5d1b586a-9b17-4bf0-9aae-512fce173232-kube-api-access-5lj6c\") pod \"ingress-canary-8swwq\" (UID: \"5d1b586a-9b17-4bf0-9aae-512fce173232\") " pod="openshift-ingress-canary/ingress-canary-8swwq" Apr 24 22:30:42.062563 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062517 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-data-volume\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.062563 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.062522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.064559 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.064533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.064666 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.064596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d1b586a-9b17-4bf0-9aae-512fce173232-cert\") pod \"ingress-canary-8swwq\" (UID: 
\"5d1b586a-9b17-4bf0-9aae-512fce173232\") " pod="openshift-ingress-canary/ingress-canary-8swwq" Apr 24 22:30:42.076581 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.076558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lj6c\" (UniqueName: \"kubernetes.io/projected/5d1b586a-9b17-4bf0-9aae-512fce173232-kube-api-access-5lj6c\") pod \"ingress-canary-8swwq\" (UID: \"5d1b586a-9b17-4bf0-9aae-512fce173232\") " pod="openshift-ingress-canary/ingress-canary-8swwq" Apr 24 22:30:42.085331 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.085303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvhzn\" (UniqueName: \"kubernetes.io/projected/8bfa88df-1892-46d6-b7cb-6e1a3ae2e536-kube-api-access-hvhzn\") pod \"insights-runtime-extractor-c9c2z\" (UID: \"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536\") " pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.099165 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.099142 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c9c2z" Apr 24 22:30:42.154876 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.154823 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8swwq" Apr 24 22:30:42.263140 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.263081 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vv7c4"] Apr 24 22:30:42.267568 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.267547 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c9c2z"] Apr 24 22:30:42.274963 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:30:42.274935 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bbc078c_a636_438c_a64a_5bdfc1d13816.slice/crio-63323e44f519db73c272811fed15fc083ae9ea221b2afcb976e999f5f8f6ae45 WatchSource:0}: Error finding container 63323e44f519db73c272811fed15fc083ae9ea221b2afcb976e999f5f8f6ae45: Status 404 returned error can't find the container with id 63323e44f519db73c272811fed15fc083ae9ea221b2afcb976e999f5f8f6ae45 Apr 24 22:30:42.280795 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:30:42.280771 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bfa88df_1892_46d6_b7cb_6e1a3ae2e536.slice/crio-3d902440daa5661a173b864b431c51be05e946afb6d0660ff3d31e5e97de9501 WatchSource:0}: Error finding container 3d902440daa5661a173b864b431c51be05e946afb6d0660ff3d31e5e97de9501: Status 404 returned error can't find the container with id 3d902440daa5661a173b864b431c51be05e946afb6d0660ff3d31e5e97de9501 Apr 24 22:30:42.309780 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.309760 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8swwq"] Apr 24 22:30:42.315885 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:30:42.315856 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1b586a_9b17_4bf0_9aae_512fce173232.slice/crio-262b88703eb8363b93ba3b2f7064b9ed139553f757baea05f395eef3b1eee983 WatchSource:0}: Error finding container 262b88703eb8363b93ba3b2f7064b9ed139553f757baea05f395eef3b1eee983: Status 404 returned error can't find the container with id 262b88703eb8363b93ba3b2f7064b9ed139553f757baea05f395eef3b1eee983 Apr 24 22:30:42.413945 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.413923 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:42.417279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.417255 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-74npb\"" Apr 24 22:30:42.417588 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.417341 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:42.626249 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.626192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8swwq" event={"ID":"5d1b586a-9b17-4bf0-9aae-512fce173232","Type":"ContainerStarted","Data":"262b88703eb8363b93ba3b2f7064b9ed139553f757baea05f395eef3b1eee983"} Apr 24 22:30:42.627274 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.627243 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vv7c4" event={"ID":"1bbc078c-a636-438c-a64a-5bdfc1d13816","Type":"ContainerStarted","Data":"63323e44f519db73c272811fed15fc083ae9ea221b2afcb976e999f5f8f6ae45"} Apr 24 22:30:42.628428 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.628405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9c2z" 
event={"ID":"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536","Type":"ContainerStarted","Data":"6f8c1697b83fe721be24668231c6b89829a1c2ac48eefd9a549148d93142b273"} Apr 24 22:30:42.628503 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.628431 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9c2z" event={"ID":"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536","Type":"ContainerStarted","Data":"3d902440daa5661a173b864b431c51be05e946afb6d0660ff3d31e5e97de9501"} Apr 24 22:30:42.981640 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.981496 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d49f586d7-2wfr7"] Apr 24 22:30:42.993782 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:42.993754 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.011636 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.011612 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 22:30:43.012098 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.012075 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 22:30:43.013460 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.012924 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 22:30:43.013460 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.012924 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 22:30:43.013460 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.013230 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 22:30:43.013460 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.013244 
2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 22:30:43.017556 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.017528 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 22:30:43.027327 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.027301 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d49f586d7-2wfr7"] Apr 24 22:30:43.057025 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.056830 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-zgzgn\"" Apr 24 22:30:43.170638 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.170616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-oauth-serving-cert\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.170773 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.170648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-console-config\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.170773 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.170710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-oauth-config\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " 
pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.170773 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.170735 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-serving-cert\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.170932 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.170776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-service-ca\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.170932 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.170800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj56d\" (UniqueName: \"kubernetes.io/projected/749d05ed-a876-4e79-a759-a5f750366c90-kube-api-access-bj56d\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.271567 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.271502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj56d\" (UniqueName: \"kubernetes.io/projected/749d05ed-a876-4e79-a759-a5f750366c90-kube-api-access-bj56d\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.271567 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.271555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-oauth-serving-cert\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.271756 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.271593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-console-config\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.271756 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.271653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-oauth-config\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.271756 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.271681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-serving-cert\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.271756 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.271721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-service-ca\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.272680 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.272636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-oauth-serving-cert\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.272680 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.272658 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-console-config\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.272819 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.272658 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-service-ca\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.275823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.275792 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-serving-cert\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.275948 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.275919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-oauth-config\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.281622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.281580 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bj56d\" (UniqueName: \"kubernetes.io/projected/749d05ed-a876-4e79-a759-a5f750366c90-kube-api-access-bj56d\") pod \"console-d49f586d7-2wfr7\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") " pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.305268 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.305208 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:43.417585 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.417256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:43.421229 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.421207 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-499pd\"" Apr 24 22:30:43.421456 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.421437 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:30:43.421544 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:43.421471 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:30:44.263206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.262962 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d49f586d7-2wfr7"] Apr 24 22:30:44.266231 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:30:44.266201 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749d05ed_a876_4e79_a759_a5f750366c90.slice/crio-140e908c61a172337057487326c1b6b30da0dbb762946168bbd6d2b9f8db7833 WatchSource:0}: Error finding container 140e908c61a172337057487326c1b6b30da0dbb762946168bbd6d2b9f8db7833: 
Status 404 returned error can't find the container with id 140e908c61a172337057487326c1b6b30da0dbb762946168bbd6d2b9f8db7833 Apr 24 22:30:44.635440 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.635368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9c2z" event={"ID":"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536","Type":"ContainerStarted","Data":"c157f5d3fb281b6b158702c986a8895735d5728a910b1196ac09b10176d17ac0"} Apr 24 22:30:44.636616 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.636586 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d49f586d7-2wfr7" event={"ID":"749d05ed-a876-4e79-a759-a5f750366c90","Type":"ContainerStarted","Data":"140e908c61a172337057487326c1b6b30da0dbb762946168bbd6d2b9f8db7833"} Apr 24 22:30:44.638235 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.638210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8swwq" event={"ID":"5d1b586a-9b17-4bf0-9aae-512fce173232","Type":"ContainerStarted","Data":"9e07f5bcb993921fc8d8580c390a5348d238c5963469a48019ac69cc15e119ca"} Apr 24 22:30:44.640464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.640446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vv7c4" event={"ID":"1bbc078c-a636-438c-a64a-5bdfc1d13816","Type":"ContainerStarted","Data":"3507cfc0310df17d349e5cdc75d4b2d8b37cad217879aa4cfb52262bcd3cf296"} Apr 24 22:30:44.640569 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.640472 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vv7c4" event={"ID":"1bbc078c-a636-438c-a64a-5bdfc1d13816","Type":"ContainerStarted","Data":"fd65c26de43c647a58d52cb513379a1769f91913912c6e44b556f2038b1841e1"} Apr 24 22:30:44.640647 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.640628 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:44.664114 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.664073 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8swwq" podStartSLOduration=1.867504416 podStartE2EDuration="3.664058038s" podCreationTimestamp="2026-04-24 22:30:41 +0000 UTC" firstStartedPulling="2026-04-24 22:30:42.317593336 +0000 UTC m=+49.465898862" lastFinishedPulling="2026-04-24 22:30:44.114146949 +0000 UTC m=+51.262452484" observedRunningTime="2026-04-24 22:30:44.663376756 +0000 UTC m=+51.811682305" watchObservedRunningTime="2026-04-24 22:30:44.664058038 +0000 UTC m=+51.812363588" Apr 24 22:30:44.705894 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:44.705854 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vv7c4" podStartSLOduration=1.872721284 podStartE2EDuration="3.705838337s" podCreationTimestamp="2026-04-24 22:30:41 +0000 UTC" firstStartedPulling="2026-04-24 22:30:42.276433159 +0000 UTC m=+49.424738684" lastFinishedPulling="2026-04-24 22:30:44.109550208 +0000 UTC m=+51.257855737" observedRunningTime="2026-04-24 22:30:44.704787702 +0000 UTC m=+51.853093251" watchObservedRunningTime="2026-04-24 22:30:44.705838337 +0000 UTC m=+51.854143887" Apr 24 22:30:45.645992 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:45.645958 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c9c2z" event={"ID":"8bfa88df-1892-46d6-b7cb-6e1a3ae2e536","Type":"ContainerStarted","Data":"e6a6f26fad530647ad810b60120cdb14b31b70b819b6743926929e029a12fa1c"} Apr 24 22:30:47.652318 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:47.652285 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d49f586d7-2wfr7" event={"ID":"749d05ed-a876-4e79-a759-a5f750366c90","Type":"ContainerStarted","Data":"7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069"} Apr 
24 22:30:47.681771 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:47.681729 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c9c2z" podStartSLOduration=3.614777094 podStartE2EDuration="6.6817157s" podCreationTimestamp="2026-04-24 22:30:41 +0000 UTC" firstStartedPulling="2026-04-24 22:30:42.37538112 +0000 UTC m=+49.523686646" lastFinishedPulling="2026-04-24 22:30:45.442319724 +0000 UTC m=+52.590625252" observedRunningTime="2026-04-24 22:30:45.694879714 +0000 UTC m=+52.843185262" watchObservedRunningTime="2026-04-24 22:30:47.6817157 +0000 UTC m=+54.830021247" Apr 24 22:30:50.575187 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.575128 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d49f586d7-2wfr7" podStartSLOduration=5.39926223 podStartE2EDuration="8.575108952s" podCreationTimestamp="2026-04-24 22:30:42 +0000 UTC" firstStartedPulling="2026-04-24 22:30:44.268448906 +0000 UTC m=+51.416754438" lastFinishedPulling="2026-04-24 22:30:47.444295631 +0000 UTC m=+54.592601160" observedRunningTime="2026-04-24 22:30:47.684881427 +0000 UTC m=+54.833186972" watchObservedRunningTime="2026-04-24 22:30:50.575108952 +0000 UTC m=+57.723414499" Apr 24 22:30:50.575674 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.575283 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-fcb68766d-9t7jp"] Apr 24 22:30:50.579738 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.579711 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.589875 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.589802 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 22:30:50.603143 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.603118 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fcb68766d-9t7jp"] Apr 24 22:30:50.618666 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.618640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v6l7\" (UniqueName: \"kubernetes.io/projected/a8437dad-a272-4baa-915e-fa0d39fb9fd2-kube-api-access-2v6l7\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.618767 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.618669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-oauth-config\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.618767 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.618699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-config\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.618767 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.618722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-service-ca\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.618767 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.618759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-oauth-serving-cert\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.618896 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.618785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-trusted-ca-bundle\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.618896 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.618805 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-serving-cert\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.719533 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.719503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2v6l7\" (UniqueName: \"kubernetes.io/projected/a8437dad-a272-4baa-915e-fa0d39fb9fd2-kube-api-access-2v6l7\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.719658 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:30:50.719536 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-oauth-config\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.719658 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.719559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-config\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.719658 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.719583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-service-ca\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.719658 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.719619 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-oauth-serving-cert\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.719658 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.719655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-trusted-ca-bundle\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 
22:30:50.719920 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.719683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-serving-cert\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.720381 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.720354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-config\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.720381 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.720379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-service-ca\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.721136 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.721118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-oauth-serving-cert\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.721504 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.721482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-trusted-ca-bundle\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " 
pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.723483 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.723462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-oauth-config\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.723570 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.723540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-serving-cert\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.742850 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.742830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v6l7\" (UniqueName: \"kubernetes.io/projected/a8437dad-a272-4baa-915e-fa0d39fb9fd2-kube-api-access-2v6l7\") pod \"console-fcb68766d-9t7jp\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") " pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:50.891193 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:50.891127 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:30:51.046173 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:51.046097 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fcb68766d-9t7jp"] Apr 24 22:30:51.049254 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:30:51.049229 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8437dad_a272_4baa_915e_fa0d39fb9fd2.slice/crio-16ec3f3c0d98f6ec571667860bf67d2dbe6f554bf8898f3bfcafa70dfa10452f WatchSource:0}: Error finding container 16ec3f3c0d98f6ec571667860bf67d2dbe6f554bf8898f3bfcafa70dfa10452f: Status 404 returned error can't find the container with id 16ec3f3c0d98f6ec571667860bf67d2dbe6f554bf8898f3bfcafa70dfa10452f Apr 24 22:30:51.667611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:51.667574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fcb68766d-9t7jp" event={"ID":"a8437dad-a272-4baa-915e-fa0d39fb9fd2","Type":"ContainerStarted","Data":"0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b"} Apr 24 22:30:51.667611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:51.667611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fcb68766d-9t7jp" event={"ID":"a8437dad-a272-4baa-915e-fa0d39fb9fd2","Type":"ContainerStarted","Data":"16ec3f3c0d98f6ec571667860bf67d2dbe6f554bf8898f3bfcafa70dfa10452f"} Apr 24 22:30:51.711729 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:51.711687 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-fcb68766d-9t7jp" podStartSLOduration=1.711673486 podStartE2EDuration="1.711673486s" podCreationTimestamp="2026-04-24 22:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:51.710596105 +0000 UTC m=+58.858901655" 
watchObservedRunningTime="2026-04-24 22:30:51.711673486 +0000 UTC m=+58.859979033" Apr 24 22:30:52.585564 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:52.585538 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9trm" Apr 24 22:30:53.305354 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:53.305330 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:53.305704 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:53.305599 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:53.310303 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:53.310275 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:53.676182 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:53.676102 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d49f586d7-2wfr7" Apr 24 22:30:54.648158 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:54.648125 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vv7c4" Apr 24 22:30:59.077027 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.076959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:59.079926 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.079901 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:59.089805 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:30:59.089775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edbc33b8-02e4-43d1-a683-6dcd726340b7-metrics-certs\") pod \"network-metrics-daemon-v4wwp\" (UID: \"edbc33b8-02e4-43d1-a683-6dcd726340b7\") " pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:59.225089 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.225060 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-74npb\"" Apr 24 22:30:59.232763 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.232742 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v4wwp" Apr 24 22:30:59.278771 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.278742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: \"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:59.282180 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.282159 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:30:59.292494 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.292446 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:30:59.302644 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.302624 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtlkc\" (UniqueName: \"kubernetes.io/projected/6a2536b6-b046-4594-8305-33498ed4dadd-kube-api-access-mtlkc\") pod \"network-check-target-6q8zg\" (UID: 
\"6a2536b6-b046-4594-8305-33498ed4dadd\") " pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:59.332041 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.331978 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-499pd\"" Apr 24 22:30:59.339583 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.339563 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:30:59.355743 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.355722 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v4wwp"] Apr 24 22:30:59.357571 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:30:59.357542 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedbc33b8_02e4_43d1_a683_6dcd726340b7.slice/crio-43ab5efa9c0f61f98ac3890a9ae1910bb33c0647a3a2db7b26c9cded05a7234a WatchSource:0}: Error finding container 43ab5efa9c0f61f98ac3890a9ae1910bb33c0647a3a2db7b26c9cded05a7234a: Status 404 returned error can't find the container with id 43ab5efa9c0f61f98ac3890a9ae1910bb33c0647a3a2db7b26c9cded05a7234a Apr 24 22:30:59.447904 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.447880 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6q8zg"] Apr 24 22:30:59.450352 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:30:59.450324 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a2536b6_b046_4594_8305_33498ed4dadd.slice/crio-7a186c9c5812004d7166ecd5e25f17860ce5133e4adce13a5e1aea5fbbb617b0 WatchSource:0}: Error finding container 7a186c9c5812004d7166ecd5e25f17860ce5133e4adce13a5e1aea5fbbb617b0: Status 404 returned error can't find the container with id 
7a186c9c5812004d7166ecd5e25f17860ce5133e4adce13a5e1aea5fbbb617b0 Apr 24 22:30:59.686883 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.686814 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6q8zg" event={"ID":"6a2536b6-b046-4594-8305-33498ed4dadd","Type":"ContainerStarted","Data":"7a186c9c5812004d7166ecd5e25f17860ce5133e4adce13a5e1aea5fbbb617b0"} Apr 24 22:30:59.687727 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:30:59.687703 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v4wwp" event={"ID":"edbc33b8-02e4-43d1-a683-6dcd726340b7","Type":"ContainerStarted","Data":"43ab5efa9c0f61f98ac3890a9ae1910bb33c0647a3a2db7b26c9cded05a7234a"} Apr 24 22:31:00.693729 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:00.693695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v4wwp" event={"ID":"edbc33b8-02e4-43d1-a683-6dcd726340b7","Type":"ContainerStarted","Data":"6764e865370c552a0607cbe63b4068f1900e9e5fd9e4c592801064f475edf36f"} Apr 24 22:31:00.891539 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:00.891506 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:31:00.891676 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:00.891548 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:31:00.897111 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:00.897091 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:31:01.698805 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:01.698756 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v4wwp" 
event={"ID":"edbc33b8-02e4-43d1-a683-6dcd726340b7","Type":"ContainerStarted","Data":"ac299208f6c06c6394494f330f9fd0ed6854553b623a49ad5674cf8d85753ed9"} Apr 24 22:31:01.703642 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:01.703610 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:31:01.715795 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:01.715757 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-v4wwp" podStartSLOduration=67.583697916 podStartE2EDuration="1m8.715743954s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:30:59.359555883 +0000 UTC m=+66.507861412" lastFinishedPulling="2026-04-24 22:31:00.491601911 +0000 UTC m=+67.639907450" observedRunningTime="2026-04-24 22:31:01.71538083 +0000 UTC m=+68.863686380" watchObservedRunningTime="2026-04-24 22:31:01.715743954 +0000 UTC m=+68.864049502" Apr 24 22:31:01.763365 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:01.763337 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d49f586d7-2wfr7"] Apr 24 22:31:02.703146 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:02.703108 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6q8zg" event={"ID":"6a2536b6-b046-4594-8305-33498ed4dadd","Type":"ContainerStarted","Data":"eecf77ef9267aa772b1dd694382af06b289b534a83106bbf34ed74b598aa48e7"} Apr 24 22:31:02.703671 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:02.703652 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6q8zg" Apr 24 22:31:02.725579 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:02.725530 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6q8zg" 
podStartSLOduration=67.045671474 podStartE2EDuration="1m9.725515504s" podCreationTimestamp="2026-04-24 22:29:53 +0000 UTC" firstStartedPulling="2026-04-24 22:30:59.452144089 +0000 UTC m=+66.600449615" lastFinishedPulling="2026-04-24 22:31:02.131988114 +0000 UTC m=+69.280293645" observedRunningTime="2026-04-24 22:31:02.723963919 +0000 UTC m=+69.872269466" watchObservedRunningTime="2026-04-24 22:31:02.725515504 +0000 UTC m=+69.873821066" Apr 24 22:31:05.042748 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:05.042710 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92"] Apr 24 22:31:05.045606 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:05.045590 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:05.048830 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:05.048809 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 22:31:05.049102 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:05.049089 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-dz98z\"" Apr 24 22:31:05.055549 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:05.055530 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92"] Apr 24 22:31:05.115434 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:05.115407 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kw92\" (UID: \"b1c4549d-5242-4209-9116-5088ce9fc89a\") 
" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:05.215681 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:05.215654 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kw92\" (UID: \"b1c4549d-5242-4209-9116-5088ce9fc89a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:05.215792 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:05.215776 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 24 22:31:05.215870 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:05.215853 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates podName:b1c4549d-5242-4209-9116-5088ce9fc89a nodeName:}" failed. No retries permitted until 2026-04-24 22:31:05.71583068 +0000 UTC m=+72.864136223 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-9kw92" (UID: "b1c4549d-5242-4209-9116-5088ce9fc89a") : secret "prometheus-operator-admission-webhook-tls" not found Apr 24 22:31:05.720171 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:05.720143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kw92\" (UID: \"b1c4549d-5242-4209-9116-5088ce9fc89a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:05.720284 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:05.720245 2572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 24 22:31:05.720327 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:05.720297 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates podName:b1c4549d-5242-4209-9116-5088ce9fc89a nodeName:}" failed. No retries permitted until 2026-04-24 22:31:06.720281386 +0000 UTC m=+73.868586911 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-9kw92" (UID: "b1c4549d-5242-4209-9116-5088ce9fc89a") : secret "prometheus-operator-admission-webhook-tls" not found Apr 24 22:31:06.725656 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:06.725631 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kw92\" (UID: \"b1c4549d-5242-4209-9116-5088ce9fc89a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:06.728000 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:06.727973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b1c4549d-5242-4209-9116-5088ce9fc89a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kw92\" (UID: \"b1c4549d-5242-4209-9116-5088ce9fc89a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:06.854633 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:06.854610 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:06.972708 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:06.972686 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92"] Apr 24 22:31:06.974574 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:31:06.974547 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1c4549d_5242_4209_9116_5088ce9fc89a.slice/crio-007a6ef66c2cc2071b0a6013284b5a6208e7ca4fd9c89901ed611c7bfb4686d0 WatchSource:0}: Error finding container 007a6ef66c2cc2071b0a6013284b5a6208e7ca4fd9c89901ed611c7bfb4686d0: Status 404 returned error can't find the container with id 007a6ef66c2cc2071b0a6013284b5a6208e7ca4fd9c89901ed611c7bfb4686d0 Apr 24 22:31:07.718360 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:07.718322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" event={"ID":"b1c4549d-5242-4209-9116-5088ce9fc89a","Type":"ContainerStarted","Data":"007a6ef66c2cc2071b0a6013284b5a6208e7ca4fd9c89901ed611c7bfb4686d0"} Apr 24 22:31:08.724488 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:08.724452 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" event={"ID":"b1c4549d-5242-4209-9116-5088ce9fc89a","Type":"ContainerStarted","Data":"44dfad2555a6da4bfd887c4603f51f5947b6957ca449e766f1d06d5b654b4e2f"} Apr 24 22:31:08.724885 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:08.724640 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:08.730105 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:08.730082 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" Apr 24 22:31:08.741578 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:08.741533 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kw92" podStartSLOduration=2.686357842 podStartE2EDuration="3.741522825s" podCreationTimestamp="2026-04-24 22:31:05 +0000 UTC" firstStartedPulling="2026-04-24 22:31:06.976446941 +0000 UTC m=+74.124752467" lastFinishedPulling="2026-04-24 22:31:08.031611913 +0000 UTC m=+75.179917450" observedRunningTime="2026-04-24 22:31:08.740537751 +0000 UTC m=+75.888843291" watchObservedRunningTime="2026-04-24 22:31:08.741522825 +0000 UTC m=+75.889828373" Apr 24 22:31:09.102413 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.102380 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v84xt"] Apr 24 22:31:09.105618 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.105597 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.109723 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.109689 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-b4s7r\"" Apr 24 22:31:09.109842 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.109824 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 22:31:09.109935 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.109903 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 22:31:09.110075 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.109954 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 22:31:09.110226 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.110207 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 22:31:09.110311 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.110224 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 22:31:09.117658 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.117632 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v84xt"] Apr 24 22:31:09.143612 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.143589 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19827ebc-0817-470a-95f6-3133e65a770d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.143690 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.143628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/19827ebc-0817-470a-95f6-3133e65a770d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.143690 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.143652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6xk2\" (UniqueName: \"kubernetes.io/projected/19827ebc-0817-470a-95f6-3133e65a770d-kube-api-access-z6xk2\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.143690 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.143683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/19827ebc-0817-470a-95f6-3133e65a770d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.244487 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.244464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19827ebc-0817-470a-95f6-3133e65a770d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.244602 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:31:09.244496 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/19827ebc-0817-470a-95f6-3133e65a770d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.244602 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.244518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6xk2\" (UniqueName: \"kubernetes.io/projected/19827ebc-0817-470a-95f6-3133e65a770d-kube-api-access-z6xk2\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.244678 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.244637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/19827ebc-0817-470a-95f6-3133e65a770d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.245892 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.245864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19827ebc-0817-470a-95f6-3133e65a770d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.247284 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.247260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/19827ebc-0817-470a-95f6-3133e65a770d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.247628 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.247604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/19827ebc-0817-470a-95f6-3133e65a770d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.254032 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.253981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6xk2\" (UniqueName: \"kubernetes.io/projected/19827ebc-0817-470a-95f6-3133e65a770d-kube-api-access-z6xk2\") pod \"prometheus-operator-5676c8c784-v84xt\" (UID: \"19827ebc-0817-470a-95f6-3133e65a770d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.416373 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.416314 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" Apr 24 22:31:09.548358 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.548330 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v84xt"] Apr 24 22:31:09.551211 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:31:09.551189 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19827ebc_0817_470a_95f6_3133e65a770d.slice/crio-325c7ea7b0edff5854047e3b7d3c9f5ac77086d6b4db772a1e22955bf17869f7 WatchSource:0}: Error finding container 325c7ea7b0edff5854047e3b7d3c9f5ac77086d6b4db772a1e22955bf17869f7: Status 404 returned error can't find the container with id 325c7ea7b0edff5854047e3b7d3c9f5ac77086d6b4db772a1e22955bf17869f7 Apr 24 22:31:09.727797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:09.727739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" event={"ID":"19827ebc-0817-470a-95f6-3133e65a770d","Type":"ContainerStarted","Data":"325c7ea7b0edff5854047e3b7d3c9f5ac77086d6b4db772a1e22955bf17869f7"} Apr 24 22:31:11.735067 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:11.735004 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" event={"ID":"19827ebc-0817-470a-95f6-3133e65a770d","Type":"ContainerStarted","Data":"1bbf608d9a28277fd66c56c99f29f3b1e793f4c65e366b3a69247ff0dd483c91"} Apr 24 22:31:11.735067 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:11.735062 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" event={"ID":"19827ebc-0817-470a-95f6-3133e65a770d","Type":"ContainerStarted","Data":"bd5b9d035392fd145c663f7328a9dfaf3594378673a1780553c93c51e262676b"} Apr 24 22:31:11.755827 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:11.755773 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-v84xt" podStartSLOduration=1.613112173 podStartE2EDuration="2.755755504s" podCreationTimestamp="2026-04-24 22:31:09 +0000 UTC" firstStartedPulling="2026-04-24 22:31:09.553049804 +0000 UTC m=+76.701355334" lastFinishedPulling="2026-04-24 22:31:10.695693135 +0000 UTC m=+77.843998665" observedRunningTime="2026-04-24 22:31:11.755693432 +0000 UTC m=+78.903998981" watchObservedRunningTime="2026-04-24 22:31:11.755755504 +0000 UTC m=+78.904061054" Apr 24 22:31:13.520029 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.519991 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96"] Apr 24 22:31:13.524139 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.524118 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.533352 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.533331 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 22:31:13.533480 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.533464 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-vmwr2\"" Apr 24 22:31:13.533572 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.533555 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 22:31:13.542219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.542202 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96"] Apr 24 22:31:13.551104 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.551085 2572 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9fmp9"] Apr 24 22:31:13.554024 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.553996 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.556738 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.556720 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 22:31:13.557057 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.557044 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 22:31:13.557189 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.557173 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-jlllf\"" Apr 24 22:31:13.557415 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.557399 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 22:31:13.573683 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.573777 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/b907e1af-4d6b-43b5-9af8-d5f2e469c573-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.573777 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpdlr\" (UniqueName: \"kubernetes.io/projected/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-api-access-cpdlr\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.573860 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573820 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.573860 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b907e1af-4d6b-43b5-9af8-d5f2e469c573-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.573959 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573866 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.573959 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.574088 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.574088 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.573987 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74dh\" (UniqueName: \"kubernetes.io/projected/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-kube-api-access-w74dh\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.574088 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.574034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: 
\"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.594692 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.594663 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9fmp9"] Apr 24 22:31:13.596848 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.596824 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8bsz5"] Apr 24 22:31:13.600669 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.600653 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.606293 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:13.606263 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"node-exporter-tls\" is forbidden: User \"system:node:ip-10-0-137-103.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-137-103.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" type="*v1.Secret" Apr 24 22:31:13.606409 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:13.606329 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"node-exporter-dockercfg-8plhd\" is forbidden: User \"system:node:ip-10-0-137-103.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-137-103.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8plhd\"" type="*v1.Secret" Apr 24 22:31:13.606501 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:13.606381 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"node-exporter-kube-rbac-proxy-config\" is 
forbidden: User \"system:node:ip-10-0-137-103.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-137-103.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" type="*v1.Secret" Apr 24 22:31:13.606664 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:13.606642 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"node-exporter-accelerators-collector-config\" is forbidden: User \"system:node:ip-10-0-137-103.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-137-103.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" type="*v1.ConfigMap" Apr 24 22:31:13.674760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.674733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.674873 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.674778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-wtmp\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.674873 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.674809 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b907e1af-4d6b-43b5-9af8-d5f2e469c573-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.674873 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.674833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpdlr\" (UniqueName: \"kubernetes.io/projected/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-api-access-cpdlr\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.675076 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.674865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggvw\" (UniqueName: \"kubernetes.io/projected/f3573d49-4d97-426a-86f2-6e6731507efa-kube-api-access-lggvw\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.675076 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.674911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.675076 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:13.675053 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 24 22:31:13.675216 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675093 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b907e1af-4d6b-43b5-9af8-d5f2e469c573-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.675216 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:13.675107 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-tls podName:b907e1af-4d6b-43b5-9af8-d5f2e469c573 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:14.175087946 +0000 UTC m=+81.323393477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-9fmp9" (UID: "b907e1af-4d6b-43b5-9af8-d5f2e469c573") : secret "kube-state-metrics-tls" not found Apr 24 22:31:13.675216 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-textfile\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.675216 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.675427 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675242 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b907e1af-4d6b-43b5-9af8-d5f2e469c573-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.675427 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-root\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.675427 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.675427 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-tls\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.675622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675434 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-accelerators-collector-config\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.675622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.675622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w74dh\" (UniqueName: \"kubernetes.io/projected/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-kube-api-access-w74dh\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.675622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675543 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.675622 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:13.675559 2572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 22:31:13.675622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675591 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.675622 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:13.675623 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls podName:e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:14.175604655 +0000 UTC m=+81.323910184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-s4c96" (UID: "e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48") : secret "openshift-state-metrics-tls" not found Apr 24 22:31:13.676007 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-sys\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.676007 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.675707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-metrics-client-ca\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.676279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.676258 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.676525 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.676495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.676858 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.676803 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b907e1af-4d6b-43b5-9af8-d5f2e469c573-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.677572 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.677551 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.678271 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.678253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.691613 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.691594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74dh\" (UniqueName: \"kubernetes.io/projected/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-kube-api-access-w74dh\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:13.692046 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.692031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpdlr\" (UniqueName: \"kubernetes.io/projected/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-api-access-cpdlr\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:13.776193 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-wtmp\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776193 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lggvw\" (UniqueName: \"kubernetes.io/projected/f3573d49-4d97-426a-86f2-6e6731507efa-kube-api-access-lggvw\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776339 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-textfile\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-root\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-tls\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-accelerators-collector-config\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8bsz5\" (UID: 
\"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-sys\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776329 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-wtmp\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776584 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-metrics-client-ca\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776584 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-sys\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776584 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776448 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3573d49-4d97-426a-86f2-6e6731507efa-root\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " 
pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776584 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-textfile\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.776767 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.776750 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-metrics-client-ca\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:13.785595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:13.785576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggvw\" (UniqueName: \"kubernetes.io/projected/f3573d49-4d97-426a-86f2-6e6731507efa-kube-api-access-lggvw\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5" Apr 24 22:31:14.179428 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.179361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" Apr 24 22:31:14.179578 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.179433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:14.179578 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:14.179491 2572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 22:31:14.179578 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:14.179570 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls podName:e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:15.179555884 +0000 UTC m=+82.327861410 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-s4c96" (UID: "e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48") : secret "openshift-state-metrics-tls" not found Apr 24 22:31:14.181725 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.181698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b907e1af-4d6b-43b5-9af8-d5f2e469c573-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9fmp9\" (UID: \"b907e1af-4d6b-43b5-9af8-d5f2e469c573\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:14.462354 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.462297 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" Apr 24 22:31:14.593749 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.593723 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9fmp9"] Apr 24 22:31:14.596076 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:31:14.596048 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb907e1af_4d6b_43b5_9af8_d5f2e469c573.slice/crio-8e6ca13b400b8508f307f581a25f3448f567fae3c8134e7b4138251afb58844c WatchSource:0}: Error finding container 8e6ca13b400b8508f307f581a25f3448f567fae3c8134e7b4138251afb58844c: Status 404 returned error can't find the container with id 8e6ca13b400b8508f307f581a25f3448f567fae3c8134e7b4138251afb58844c Apr 24 22:31:14.604143 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.604120 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:31:14.609154 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.609131 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8plhd\"" Apr 24 22:31:14.610522 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.610497 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.613185 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.613169 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 22:31:14.613672 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.613657 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 22:31:14.613777 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.613758 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 22:31:14.613879 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.613765 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 22:31:14.614042 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.614027 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 22:31:14.614395 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.614375 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 22:31:14.614489 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.614450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 22:31:14.614728 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.614714 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 22:31:14.614994 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.614973 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-x8kh2\"" Apr 24 22:31:14.622563 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.622545 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 22:31:14.648478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.648199 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:31:14.683913 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.683841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-web-config\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.683913 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.683902 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd5kw\" (UniqueName: \"kubernetes.io/projected/e44a6833-98fd-4227-8575-b155c7daa7df-kube-api-access-vd5kw\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684134 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.683963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-config-volume\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684134 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/e44a6833-98fd-4227-8575-b155c7daa7df-config-out\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684134 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44a6833-98fd-4227-8575-b155c7daa7df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684134 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684134 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684134 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e44a6833-98fd-4227-8575-b155c7daa7df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684400 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684400 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e44a6833-98fd-4227-8575-b155c7daa7df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684400 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684400 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684400 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:31:14.684400 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.684366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/e44a6833-98fd-4227-8575-b155c7daa7df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.745544 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.745510 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" event={"ID":"b907e1af-4d6b-43b5-9af8-d5f2e469c573","Type":"ContainerStarted","Data":"8e6ca13b400b8508f307f581a25f3448f567fae3c8134e7b4138251afb58844c"}
Apr 24 22:31:14.776663 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:14.776643 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition
Apr 24 22:31:14.776663 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:14.776655 2572 configmap.go:193] Couldn't get configMap openshift-monitoring/node-exporter-accelerators-collector-config: failed to sync configmap cache: timed out waiting for the condition
Apr 24 22:31:14.776790 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:14.776704 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-tls podName:f3573d49-4d97-426a-86f2-6e6731507efa nodeName:}" failed. No retries permitted until 2026-04-24 22:31:15.276688019 +0000 UTC m=+82.424993544 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-tls") pod "node-exporter-8bsz5" (UID: "f3573d49-4d97-426a-86f2-6e6731507efa") : failed to sync secret cache: timed out waiting for the condition
Apr 24 22:31:14.776790 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:14.776649 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Apr 24 22:31:14.776790 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:14.776718 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-accelerators-collector-config podName:f3573d49-4d97-426a-86f2-6e6731507efa nodeName:}" failed. No retries permitted until 2026-04-24 22:31:15.276711731 +0000 UTC m=+82.425017256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-accelerators-collector-config" (UniqueName: "kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-accelerators-collector-config") pod "node-exporter-8bsz5" (UID: "f3573d49-4d97-426a-86f2-6e6731507efa") : failed to sync configmap cache: timed out waiting for the condition
Apr 24 22:31:14.776790 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:14.776737 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-kube-rbac-proxy-config podName:f3573d49-4d97-426a-86f2-6e6731507efa nodeName:}" failed. No retries permitted until 2026-04-24 22:31:15.276724379 +0000 UTC m=+82.425029906 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-kube-rbac-proxy-config") pod "node-exporter-8bsz5" (UID: "f3573d49-4d97-426a-86f2-6e6731507efa") : failed to sync secret cache: timed out waiting for the condition
Apr 24 22:31:14.785145 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785241 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e44a6833-98fd-4227-8575-b155c7daa7df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785241 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-web-config\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785241 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd5kw\" (UniqueName: \"kubernetes.io/projected/e44a6833-98fd-4227-8575-b155c7daa7df-kube-api-access-vd5kw\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785241
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-config-volume\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785241 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e44a6833-98fd-4227-8575-b155c7daa7df-config-out\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785485 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44a6833-98fd-4227-8575-b155c7daa7df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785485 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785485 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e44a6833-98fd-4227-8575-b155c7daa7df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID:
\"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785485 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785485 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e44a6833-98fd-4227-8575-b155c7daa7df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785485 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785485 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785478 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.785915 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.785894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName:
\"kubernetes.io/configmap/e44a6833-98fd-4227-8575-b155c7daa7df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.786380 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.786354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44a6833-98fd-4227-8575-b155c7daa7df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.788069 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.788040 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.788296 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.788269 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.788391 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.788315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-config-volume\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.788443 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.788425 2572
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-web-config\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.788490 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.788454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e44a6833-98fd-4227-8575-b155c7daa7df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.788549 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.788502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.788816 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.788795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.790024 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.790000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e44a6833-98fd-4227-8575-b155c7daa7df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.794166 ip-10-0-137-103
kubenswrapper[2572]: I0424 22:31:14.794145 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e44a6833-98fd-4227-8575-b155c7daa7df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.795405 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.795382 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e44a6833-98fd-4227-8575-b155c7daa7df-config-out\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.796463 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.796442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd5kw\" (UniqueName: \"kubernetes.io/projected/e44a6833-98fd-4227-8575-b155c7daa7df-kube-api-access-vd5kw\") pod \"alertmanager-main-0\" (UID: \"e44a6833-98fd-4227-8575-b155c7daa7df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:14.920465 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:14.920440 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:31:15.054278 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.054209 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 22:31:15.060122 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.060096 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:31:15.063001 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:31:15.062959 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode44a6833_98fd_4227_8575_b155c7daa7df.slice/crio-3667aaf99a903f67f5de2a5f29fb0e29eb80d6ca335c3015053bdf8cdd05591b WatchSource:0}: Error finding container 3667aaf99a903f67f5de2a5f29fb0e29eb80d6ca335c3015053bdf8cdd05591b: Status 404 returned error can't find the container with id 3667aaf99a903f67f5de2a5f29fb0e29eb80d6ca335c3015053bdf8cdd05591b
Apr 24 22:31:15.114689 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.114665 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 22:31:15.177816 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.177793 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 22:31:15.190317 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.190296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96"
Apr 24 22:31:15.192747 ip-10-0-137-103
kubenswrapper[2572]: I0424 22:31:15.192722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-s4c96\" (UID: \"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96"
Apr 24 22:31:15.291184 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.291154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-tls\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5"
Apr 24 22:31:15.291285 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.291203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-accelerators-collector-config\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5"
Apr 24 22:31:15.291485 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.291392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5"
Apr 24 22:31:15.291737 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.291714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName:
\"kubernetes.io/configmap/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-accelerators-collector-config\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5"
Apr 24 22:31:15.293921 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.293899 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5"
Apr 24 22:31:15.294027 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.293918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3573d49-4d97-426a-86f2-6e6731507efa-node-exporter-tls\") pod \"node-exporter-8bsz5\" (UID: \"f3573d49-4d97-426a-86f2-6e6731507efa\") " pod="openshift-monitoring/node-exporter-8bsz5"
Apr 24 22:31:15.332859 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.332795 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96"
Apr 24 22:31:15.408482 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.408458 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-8bsz5"
Apr 24 22:31:15.420526 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:31:15.420487 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3573d49_4d97_426a_86f2_6e6731507efa.slice/crio-65db5329db291be2ed30024f812459657833491eb424b07b661bce884d3d5260 WatchSource:0}: Error finding container 65db5329db291be2ed30024f812459657833491eb424b07b661bce884d3d5260: Status 404 returned error can't find the container with id 65db5329db291be2ed30024f812459657833491eb424b07b661bce884d3d5260
Apr 24 22:31:15.472063 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.471994 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96"]
Apr 24 22:31:15.752368 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.752331 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e44a6833-98fd-4227-8575-b155c7daa7df","Type":"ContainerStarted","Data":"3667aaf99a903f67f5de2a5f29fb0e29eb80d6ca335c3015053bdf8cdd05591b"}
Apr 24 22:31:15.753442 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:15.753412 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bsz5" event={"ID":"f3573d49-4d97-426a-86f2-6e6731507efa","Type":"ContainerStarted","Data":"65db5329db291be2ed30024f812459657833491eb424b07b661bce884d3d5260"}
Apr 24 22:31:15.778636 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:31:15.778603 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ecbf78_e58f_4d9d_88ab_d6b3278c3e48.slice/crio-3816d6ddc3fbfe35f25233058603b1b4f58aec6429fa31650c085191b4931936 WatchSource:0}: Error finding container 3816d6ddc3fbfe35f25233058603b1b4f58aec6429fa31650c085191b4931936: Status 404 returned error can't find the container with id
3816d6ddc3fbfe35f25233058603b1b4f58aec6429fa31650c085191b4931936
Apr 24 22:31:16.758704 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.758620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" event={"ID":"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48","Type":"ContainerStarted","Data":"82dbfbebd40758cb532a0c79971e3b74b55fb4cb7d7e60f47d348d78d261cd4e"}
Apr 24 22:31:16.758704 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.758671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" event={"ID":"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48","Type":"ContainerStarted","Data":"feab88841299709aed24df372f2b27362d6b511a15f48a52cb4271b14b3cebd5"}
Apr 24 22:31:16.758704 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.758684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" event={"ID":"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48","Type":"ContainerStarted","Data":"3816d6ddc3fbfe35f25233058603b1b4f58aec6429fa31650c085191b4931936"}
Apr 24 22:31:16.760176 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.760145 2572 generic.go:358] "Generic (PLEG): container finished" podID="e44a6833-98fd-4227-8575-b155c7daa7df" containerID="c361d6aab31845f862ab61f9a23bded3f6d134bc2341e90ae8963434759f7f30" exitCode=0
Apr 24 22:31:16.760281 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.760237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e44a6833-98fd-4227-8575-b155c7daa7df","Type":"ContainerDied","Data":"c361d6aab31845f862ab61f9a23bded3f6d134bc2341e90ae8963434759f7f30"}
Apr 24 22:31:16.761828 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.761806 2572 generic.go:358] "Generic (PLEG): container finished" podID="f3573d49-4d97-426a-86f2-6e6731507efa"
containerID="4b52de6b6a4a290abf3610771706e922b47b0c6db9bef08ed2a13bcb1fe36f16" exitCode=0
Apr 24 22:31:16.761931 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.761902 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bsz5" event={"ID":"f3573d49-4d97-426a-86f2-6e6731507efa","Type":"ContainerDied","Data":"4b52de6b6a4a290abf3610771706e922b47b0c6db9bef08ed2a13bcb1fe36f16"}
Apr 24 22:31:16.764093 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.764067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" event={"ID":"b907e1af-4d6b-43b5-9af8-d5f2e469c573","Type":"ContainerStarted","Data":"982179b1ffefe6bafb85e735daddbf019cd402387cbe970234ffd8b72736989b"}
Apr 24 22:31:16.764174 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.764098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" event={"ID":"b907e1af-4d6b-43b5-9af8-d5f2e469c573","Type":"ContainerStarted","Data":"fa649a65635968eba147034a24030e29f0adec97c7400deff116be1d08616fae"}
Apr 24 22:31:16.764174 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.764111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" event={"ID":"b907e1af-4d6b-43b5-9af8-d5f2e469c573","Type":"ContainerStarted","Data":"0e1c0e939ff038b26c5aa9e12ef8f9c5d740fd3e0ee8ca36d71c4046454388f8"}
Apr 24 22:31:16.818685 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:16.818630 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fmp9" podStartSLOduration=2.348502844 podStartE2EDuration="3.81861661s" podCreationTimestamp="2026-04-24 22:31:13 +0000 UTC" firstStartedPulling="2026-04-24 22:31:14.598237892 +0000 UTC m=+81.746543432" lastFinishedPulling="2026-04-24 22:31:16.068351658 +0000 UTC m=+83.216657198"
observedRunningTime="2026-04-24 22:31:16.816754192 +0000 UTC m=+83.965059762" watchObservedRunningTime="2026-04-24 22:31:16.81861661 +0000 UTC m=+83.966922136"
Apr 24 22:31:17.771266 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.771085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bsz5" event={"ID":"f3573d49-4d97-426a-86f2-6e6731507efa","Type":"ContainerStarted","Data":"792c66c5d814d1f960c8d10e1e3094e4d2c61cff1b32941f7e3cd944a9830c4e"}
Apr 24 22:31:17.771266 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.771133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8bsz5" event={"ID":"f3573d49-4d97-426a-86f2-6e6731507efa","Type":"ContainerStarted","Data":"48b7b024f5c57bac096227c10d5934c2233e34d9c5d8f7d96b7b704e8dbfd270"}
Apr 24 22:31:17.773041 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.772999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" event={"ID":"e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48","Type":"ContainerStarted","Data":"e1884e6540e8c7f4ef52f1bbb071a6bf4a5c1364cd0a80c84dece95047bc814a"}
Apr 24 22:31:17.793883 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.793830 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8bsz5" podStartSLOduration=3.707661693 podStartE2EDuration="4.793815186s" podCreationTimestamp="2026-04-24 22:31:13 +0000 UTC" firstStartedPulling="2026-04-24 22:31:15.422482458 +0000 UTC m=+82.570787990" lastFinishedPulling="2026-04-24 22:31:16.508635944 +0000 UTC m=+83.656941483" observedRunningTime="2026-04-24 22:31:17.792398287 +0000 UTC m=+84.940703837" watchObservedRunningTime="2026-04-24 22:31:17.793815186 +0000 UTC m=+84.942120736"
Apr 24 22:31:17.812528 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.812478 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-s4c96" podStartSLOduration=3.983726225 podStartE2EDuration="4.812462643s" podCreationTimestamp="2026-04-24 22:31:13 +0000 UTC" firstStartedPulling="2026-04-24 22:31:16.273160275 +0000 UTC m=+83.421465802" lastFinishedPulling="2026-04-24 22:31:17.101896691 +0000 UTC m=+84.250202220" observedRunningTime="2026-04-24 22:31:17.811158995 +0000 UTC m=+84.959464569" watchObservedRunningTime="2026-04-24 22:31:17.812462643 +0000 UTC m=+84.960768192"
Apr 24 22:31:17.852526 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.852501 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-594c959ccb-pdjc9"]
Apr 24 22:31:17.856700 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.856680 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:17.859799 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.859769 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 24 22:31:17.859898 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.859795 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 22:31:17.859983 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.859963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-5rl824tongeip\""
Apr 24 22:31:17.860251 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.860235 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 24 22:31:17.860378 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.860356 2572 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-vrgpm\""
Apr 24 22:31:17.860456 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.860444 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 24 22:31:17.864817 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.864781 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-594c959ccb-pdjc9"]
Apr 24 22:31:17.915844 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.915817 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb24s\" (UniqueName: \"kubernetes.io/projected/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-kube-api-access-lb24s\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:17.915984 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.915852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:17.915984 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.915880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-audit-log\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:17.915984 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.915950 2572 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-secret-metrics-server-tls\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:17.915984 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.915972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-client-ca-bundle\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:17.916233 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.916049 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-secret-metrics-server-client-certs\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:17.916233 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:17.916098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-metrics-server-audit-profiles\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:18.016656 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.016628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName:
\"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-secret-metrics-server-tls\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.016656 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.016660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-client-ca-bundle\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.016817 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.016681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-secret-metrics-server-client-certs\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.016817 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.016791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-metrics-server-audit-profiles\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.016887 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.016824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lb24s\" (UniqueName: \"kubernetes.io/projected/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-kube-api-access-lb24s\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " 
pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.016887 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.016855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.016971 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.016887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-audit-log\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.017297 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.017268 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-audit-log\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.017779 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.017756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-metrics-server-audit-profiles\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.019303 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.019277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" 
(UniqueName: \"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-secret-metrics-server-client-certs\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.019404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.019333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-secret-metrics-server-tls\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.019404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.019376 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-client-ca-bundle\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.019751 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.019730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.026082 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.025959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb24s\" (UniqueName: \"kubernetes.io/projected/81ec86e8-c0e4-4de3-89b8-6f4a08425ad4-kube-api-access-lb24s\") pod \"metrics-server-594c959ccb-pdjc9\" (UID: \"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4\") " 
pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.169744 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.169722 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:18.290915 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.290888 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dbff5548c-mbt58"] Apr 24 22:31:18.300757 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.300737 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.305497 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.305459 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dbff5548c-mbt58"] Apr 24 22:31:18.313964 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.313936 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-594c959ccb-pdjc9"] Apr 24 22:31:18.422092 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.422060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-oauth-config\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.422193 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.422119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-oauth-serving-cert\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.422275 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:31:18.422248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-trusted-ca-bundle\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.422383 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.422288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-serving-cert\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.422383 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.422346 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-service-ca\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.422383 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.422382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-console-config\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.422539 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.422446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2x2\" (UniqueName: \"kubernetes.io/projected/e439b2ed-4157-4b89-8773-c42bc87265e8-kube-api-access-zh2x2\") pod \"console-5dbff5548c-mbt58\" (UID: 
\"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.523682 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.523655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-oauth-config\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.523797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.523692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-oauth-serving-cert\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.523797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.523744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-trusted-ca-bundle\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.523797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.523773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-serving-cert\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.523946 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.523809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-service-ca\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.523946 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.523848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-console-config\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.523946 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.523889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2x2\" (UniqueName: \"kubernetes.io/projected/e439b2ed-4157-4b89-8773-c42bc87265e8-kube-api-access-zh2x2\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.524547 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.524520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-service-ca\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.524955 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.524902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-console-config\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.525672 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.525628 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-oauth-serving-cert\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.526406 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.526383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-trusted-ca-bundle\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.526672 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.526654 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-serving-cert\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.528189 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.528173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-oauth-config\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.531398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.531381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh2x2\" (UniqueName: \"kubernetes.io/projected/e439b2ed-4157-4b89-8773-c42bc87265e8-kube-api-access-zh2x2\") pod \"console-5dbff5548c-mbt58\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.620406 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.620363 2572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:31:18.734547 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.734527 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dbff5548c-mbt58"] Apr 24 22:31:18.736733 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:31:18.736708 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode439b2ed_4157_4b89_8773_c42bc87265e8.slice/crio-1555fb7bef2f6216db1d539b9b1f2544490b4ed470ea1edb19dd5466a7442812 WatchSource:0}: Error finding container 1555fb7bef2f6216db1d539b9b1f2544490b4ed470ea1edb19dd5466a7442812: Status 404 returned error can't find the container with id 1555fb7bef2f6216db1d539b9b1f2544490b4ed470ea1edb19dd5466a7442812 Apr 24 22:31:18.781822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.781705 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e44a6833-98fd-4227-8575-b155c7daa7df","Type":"ContainerStarted","Data":"bc8ee4e35be72241618311a3b2e7cd70b3d7455a69c4ee5948023dbe8b3bcf02"} Apr 24 22:31:18.781822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.781745 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e44a6833-98fd-4227-8575-b155c7daa7df","Type":"ContainerStarted","Data":"4481cfd9dfdb8a85254b90fcdd9724f863c15b3bdf9e9341b4403b9e50575dbd"} Apr 24 22:31:18.781822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.781761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e44a6833-98fd-4227-8575-b155c7daa7df","Type":"ContainerStarted","Data":"2603c782584103c61cba65d0b2dc55eab8849c65f25aef495e559c987427ab42"} Apr 24 22:31:18.781822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.781777 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e44a6833-98fd-4227-8575-b155c7daa7df","Type":"ContainerStarted","Data":"475ed0baf9df1765372eed7b1f0d0da9cd066631b00b981063815407b1012060"} Apr 24 22:31:18.781822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.781789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e44a6833-98fd-4227-8575-b155c7daa7df","Type":"ContainerStarted","Data":"5ef42db8d59710327be4c4dd0c1da06550f52f64b80aeb7ffd6cf4f6b4c4d3be"} Apr 24 22:31:18.785033 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.784589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" event={"ID":"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4","Type":"ContainerStarted","Data":"380bd2aae21531059656877742ce83f4f61d6b8026e382c7161c3538ce2d05d1"} Apr 24 22:31:18.786707 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:18.786679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbff5548c-mbt58" event={"ID":"e439b2ed-4157-4b89-8773-c42bc87265e8","Type":"ContainerStarted","Data":"1555fb7bef2f6216db1d539b9b1f2544490b4ed470ea1edb19dd5466a7442812"} Apr 24 22:31:19.793968 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.793348 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e44a6833-98fd-4227-8575-b155c7daa7df","Type":"ContainerStarted","Data":"4b09718ed76cc4828e8f7d76628b6203a7521e6c70d4e5a91c80f51817066b52"} Apr 24 22:31:19.795750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.795719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" event={"ID":"81ec86e8-c0e4-4de3-89b8-6f4a08425ad4","Type":"ContainerStarted","Data":"01dd768a2b0fc6a036d77a2e9d3ec071889a294b5906e7383b5018712ea5b42e"} Apr 24 22:31:19.797226 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.797204 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbff5548c-mbt58" event={"ID":"e439b2ed-4157-4b89-8773-c42bc87265e8","Type":"ContainerStarted","Data":"988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e"} Apr 24 22:31:19.823843 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.823812 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:31:19.828054 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.828031 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.831130 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.830671 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 22:31:19.831130 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.830906 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-zz4m8\"" Apr 24 22:31:19.832547 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.831649 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 22:31:19.832547 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.831994 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 22:31:19.832547 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.832254 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 22:31:19.832547 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.832333 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 22:31:19.832547 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:31:19.832412 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 22:31:19.832547 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.832494 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-a99ofjqepu5h7\"" Apr 24 22:31:19.832921 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.832604 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 22:31:19.835338 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.835311 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 22:31:19.835587 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.835571 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 22:31:19.835813 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.835798 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 22:31:19.841571 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.840348 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.813914531 podStartE2EDuration="5.84033628s" podCreationTimestamp="2026-04-24 22:31:14 +0000 UTC" firstStartedPulling="2026-04-24 22:31:15.065147994 +0000 UTC m=+82.213453519" lastFinishedPulling="2026-04-24 22:31:19.091569742 +0000 UTC m=+86.239875268" observedRunningTime="2026-04-24 22:31:19.833325314 +0000 UTC m=+86.981630880" watchObservedRunningTime="2026-04-24 22:31:19.84033628 +0000 UTC m=+86.988641829" Apr 24 22:31:19.841571 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.841352 2572 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 22:31:19.848249 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.848224 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:31:19.851411 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.851392 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 22:31:19.859058 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.858999 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" podStartSLOduration=1.50967181 podStartE2EDuration="2.858984052s" podCreationTimestamp="2026-04-24 22:31:17 +0000 UTC" firstStartedPulling="2026-04-24 22:31:18.32096315 +0000 UTC m=+85.469268677" lastFinishedPulling="2026-04-24 22:31:19.670275382 +0000 UTC m=+86.818580919" observedRunningTime="2026-04-24 22:31:19.857691956 +0000 UTC m=+87.005997505" watchObservedRunningTime="2026-04-24 22:31:19.858984052 +0000 UTC m=+87.007289603" Apr 24 22:31:19.878977 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.878928 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dbff5548c-mbt58" podStartSLOduration=1.878910711 podStartE2EDuration="1.878910711s" podCreationTimestamp="2026-04-24 22:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:19.878523771 +0000 UTC m=+87.026829320" watchObservedRunningTime="2026-04-24 22:31:19.878910711 +0000 UTC m=+87.027216263" Apr 24 22:31:19.946580 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.946580 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.946580 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.946823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946607 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.946823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946696 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
22:31:19.946823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.946823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.946823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4db0889d-c5f7-4627-b540-370829583e38-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946831 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4db0889d-c5f7-4627-b540-370829583e38-config-out\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lqrw\" (UniqueName: \"kubernetes.io/projected/4db0889d-c5f7-4627-b540-370829583e38-kube-api-access-2lqrw\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.946985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-config\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.947028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4db0889d-c5f7-4627-b540-370829583e38-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.947050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.947065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-web-config\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:19.947109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:19.947083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048548 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048679 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048679 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048590 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048679 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4db0889d-c5f7-4627-b540-370829583e38-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048679 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4db0889d-c5f7-4627-b540-370829583e38-config-out\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048679 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048919 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048919 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048919 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lqrw\" (UniqueName: \"kubernetes.io/projected/4db0889d-c5f7-4627-b540-370829583e38-kube-api-access-2lqrw\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048919 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-config\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048919 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4db0889d-c5f7-4627-b540-370829583e38-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.048919 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-web-config\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.048969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.049000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.049052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.049076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.049108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049528 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.049507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049668 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.049643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.049829 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.049807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.052671 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.052350 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-web-config\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.052671 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.052502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4db0889d-c5f7-4627-b540-370829583e38-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.052824 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.052723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.053226 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.053204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.054797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.053665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db0889d-c5f7-4627-b540-370829583e38-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.054797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.053757 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.054797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.054455 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.054797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.054748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.054797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.054771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4db0889d-c5f7-4627-b540-370829583e38-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.055580 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.055540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.056003 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.055980 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.056102 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.056081 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.057757 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.057721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4db0889d-c5f7-4627-b540-370829583e38-config\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.058118 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.058097 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4db0889d-c5f7-4627-b540-370829583e38-config-out\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.058921 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.058885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lqrw\" (UniqueName: \"kubernetes.io/projected/4db0889d-c5f7-4627-b540-370829583e38-kube-api-access-2lqrw\") pod \"prometheus-k8s-0\" (UID: \"4db0889d-c5f7-4627-b540-370829583e38\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.148923 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.148893 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:20.294226 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.294200 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 22:31:20.801665 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.801634 2572 generic.go:358] "Generic (PLEG): container finished" podID="4db0889d-c5f7-4627-b540-370829583e38" containerID="6b8122a8350814bece8915fa90589f32526324b46fb74f1891e0559af7a2acfe" exitCode=0
Apr 24 22:31:20.802062 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.801761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4db0889d-c5f7-4627-b540-370829583e38","Type":"ContainerDied","Data":"6b8122a8350814bece8915fa90589f32526324b46fb74f1891e0559af7a2acfe"}
Apr 24 22:31:20.802062 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:20.801795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4db0889d-c5f7-4627-b540-370829583e38","Type":"ContainerStarted","Data":"e9b689de69de53a380a7d43bd90007656c79775d750bc99885aa41dd9d5721b7"}
Apr 24 22:31:23.813143 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:23.813109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4db0889d-c5f7-4627-b540-370829583e38","Type":"ContainerStarted","Data":"41f7fd50775462951f9bdb51519510e350b342d16e9d82a779f1fbcf867f443b"}
Apr 24 22:31:23.813143 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:23.813145 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4db0889d-c5f7-4627-b540-370829583e38","Type":"ContainerStarted","Data":"974569009bd23666be93894d7abe8d27ef3efba4658dbd5d52d5d0e8e039a34b"}
Apr 24 22:31:25.827256 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:25.827224 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4db0889d-c5f7-4627-b540-370829583e38","Type":"ContainerStarted","Data":"0bf86a38860145f7a27e0e9b9a6e9992bcd8a0e2aaf0e119e6e903a716b2d0c7"}
Apr 24 22:31:25.827256 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:25.827257 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4db0889d-c5f7-4627-b540-370829583e38","Type":"ContainerStarted","Data":"e3ea5a5b462d3d4e4e7c905ad81dd42b5382238d9e5a54f1fea4b0f735e84edb"}
Apr 24 22:31:25.827646 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:25.827267 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4db0889d-c5f7-4627-b540-370829583e38","Type":"ContainerStarted","Data":"ec6ee66954b1fb706386790b976fc7084b980d22be104c389ddb0cb9da6695a8"}
Apr 24 22:31:25.827646 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:25.827277 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4db0889d-c5f7-4627-b540-370829583e38","Type":"ContainerStarted","Data":"65ef2b93fd8e100916b4f365df34b73fb39806de16b8d673515ea1840716b702"}
Apr 24 22:31:25.858574 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:25.858523 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.555072501 podStartE2EDuration="6.858505808s" podCreationTimestamp="2026-04-24 22:31:19 +0000 UTC" firstStartedPulling="2026-04-24 22:31:20.803023987 +0000 UTC m=+87.951329527" lastFinishedPulling="2026-04-24 22:31:25.106457308 +0000 UTC m=+92.254762834" observedRunningTime="2026-04-24 22:31:25.856090304 +0000 UTC m=+93.004395855" watchObservedRunningTime="2026-04-24 22:31:25.858505808 +0000 UTC m=+93.006811488"
Apr 24 22:31:26.790342 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:26.790271 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d49f586d7-2wfr7" podUID="749d05ed-a876-4e79-a759-a5f750366c90" containerName="console" containerID="cri-o://7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069" gracePeriod=15
Apr 24 22:31:27.044247 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.044196 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d49f586d7-2wfr7_749d05ed-a876-4e79-a759-a5f750366c90/console/0.log"
Apr 24 22:31:27.044515 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.044265 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d49f586d7-2wfr7"
Apr 24 22:31:27.115622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.115589 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-service-ca\") pod \"749d05ed-a876-4e79-a759-a5f750366c90\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") "
Apr 24 22:31:27.115769 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.115629 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-oauth-serving-cert\") pod \"749d05ed-a876-4e79-a759-a5f750366c90\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") "
Apr 24 22:31:27.115769 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.115646 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-serving-cert\") pod \"749d05ed-a876-4e79-a759-a5f750366c90\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") "
Apr 24 22:31:27.115769 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.115682 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj56d\" (UniqueName: \"kubernetes.io/projected/749d05ed-a876-4e79-a759-a5f750366c90-kube-api-access-bj56d\") pod \"749d05ed-a876-4e79-a759-a5f750366c90\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") "
Apr 24 22:31:27.115909 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.115772 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-oauth-config\") pod \"749d05ed-a876-4e79-a759-a5f750366c90\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") "
Apr 24 22:31:27.115909 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.115835 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-console-config\") pod \"749d05ed-a876-4e79-a759-a5f750366c90\" (UID: \"749d05ed-a876-4e79-a759-a5f750366c90\") "
Apr 24 22:31:27.116000 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.115971 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "749d05ed-a876-4e79-a759-a5f750366c90" (UID: "749d05ed-a876-4e79-a759-a5f750366c90"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:31:27.116000 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.115982 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-service-ca" (OuterVolumeSpecName: "service-ca") pod "749d05ed-a876-4e79-a759-a5f750366c90" (UID: "749d05ed-a876-4e79-a759-a5f750366c90"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:31:27.116138 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.116099 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-service-ca\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:31:27.116138 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.116114 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-oauth-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:31:27.116256 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.116227 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-console-config" (OuterVolumeSpecName: "console-config") pod "749d05ed-a876-4e79-a759-a5f750366c90" (UID: "749d05ed-a876-4e79-a759-a5f750366c90"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:31:27.117889 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.117860 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749d05ed-a876-4e79-a759-a5f750366c90-kube-api-access-bj56d" (OuterVolumeSpecName: "kube-api-access-bj56d") pod "749d05ed-a876-4e79-a759-a5f750366c90" (UID: "749d05ed-a876-4e79-a759-a5f750366c90"). InnerVolumeSpecName "kube-api-access-bj56d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:31:27.117889 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.117873 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "749d05ed-a876-4e79-a759-a5f750366c90" (UID: "749d05ed-a876-4e79-a759-a5f750366c90"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:31:27.118040 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.117890 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "749d05ed-a876-4e79-a759-a5f750366c90" (UID: "749d05ed-a876-4e79-a759-a5f750366c90"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:31:27.217414 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.217391 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bj56d\" (UniqueName: \"kubernetes.io/projected/749d05ed-a876-4e79-a759-a5f750366c90-kube-api-access-bj56d\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:31:27.217414 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.217412 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-oauth-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:31:27.217518 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.217422 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/749d05ed-a876-4e79-a759-a5f750366c90-console-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:31:27.217518 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.217431 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/749d05ed-a876-4e79-a759-a5f750366c90-console-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:31:27.834409 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.834386 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d49f586d7-2wfr7_749d05ed-a876-4e79-a759-a5f750366c90/console/0.log"
Apr 24 22:31:27.834518 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.834425 2572 generic.go:358] "Generic (PLEG): container finished" podID="749d05ed-a876-4e79-a759-a5f750366c90" containerID="7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069" exitCode=2
Apr 24 22:31:27.834518 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.834477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d49f586d7-2wfr7" event={"ID":"749d05ed-a876-4e79-a759-a5f750366c90","Type":"ContainerDied","Data":"7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069"}
Apr 24 22:31:27.834518 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.834515 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d49f586d7-2wfr7" event={"ID":"749d05ed-a876-4e79-a759-a5f750366c90","Type":"ContainerDied","Data":"140e908c61a172337057487326c1b6b30da0dbb762946168bbd6d2b9f8db7833"}
Apr 24 22:31:27.834632 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.834517 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d49f586d7-2wfr7"
Apr 24 22:31:27.834632 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.834532 2572 scope.go:117] "RemoveContainer" containerID="7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069"
Apr 24 22:31:27.843301 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.843280 2572 scope.go:117] "RemoveContainer" containerID="7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069"
Apr 24 22:31:27.843562 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:27.843541 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069\": container with ID starting with 7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069 not found: ID does not exist" containerID="7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069"
Apr 24 22:31:27.843622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.843572 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069"} err="failed to get container status \"7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069\": rpc error: code = NotFound desc = could not find container \"7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069\": container with ID starting with 7bac731336e4c928c35107a66f027227fc31f9180ce235550c3efe9fcb581069 not found: ID does not exist"
Apr 24 22:31:27.850775 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.850749 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d49f586d7-2wfr7"]
Apr 24 22:31:27.855148 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:27.855123 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d49f586d7-2wfr7"]
Apr 24 22:31:28.620802 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:28.620774 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5dbff5548c-mbt58"
Apr 24 22:31:28.621215 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:28.621057 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dbff5548c-mbt58"
Apr 24 22:31:28.626050 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:28.626029 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dbff5548c-mbt58"
Apr 24 22:31:28.842023 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:28.841988 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dbff5548c-mbt58"
Apr 24 22:31:28.884603 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:28.884539 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fcb68766d-9t7jp"]
Apr 24 22:31:29.418076 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:29.418038 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749d05ed-a876-4e79-a759-a5f750366c90" path="/var/lib/kubelet/pods/749d05ed-a876-4e79-a759-a5f750366c90/volumes"
Apr 24 22:31:30.149317 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:30.149290 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:31:34.712156 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:34.712118 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6q8zg"
Apr 24 22:31:38.170197 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:38.170160 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:38.170578 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:38.170235 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9"
Apr 24 22:31:53.904212 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:53.904166 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-fcb68766d-9t7jp" podUID="a8437dad-a272-4baa-915e-fa0d39fb9fd2" containerName="console" containerID="cri-o://0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b" gracePeriod=15
Apr 24 22:31:54.143061 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.143040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fcb68766d-9t7jp_a8437dad-a272-4baa-915e-fa0d39fb9fd2/console/0.log"
Apr 24 22:31:54.143160 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.143101 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fcb68766d-9t7jp"
Apr 24 22:31:54.301133 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301105 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-serving-cert\") pod \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") "
Apr 24 22:31:54.301257 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301183 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-trusted-ca-bundle\") pod \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") "
Apr 24 22:31:54.301318 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301285 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-config\") pod \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") "
Apr 24 22:31:54.301374 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301352 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v6l7\" (UniqueName: \"kubernetes.io/projected/a8437dad-a272-4baa-915e-fa0d39fb9fd2-kube-api-access-2v6l7\") pod \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") "
Apr 24 22:31:54.301429 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301394 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-oauth-config\") pod \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") "
Apr 24 22:31:54.301484 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301434 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-service-ca\") pod \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") "
Apr 24 22:31:54.301484 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301475 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-oauth-serving-cert\") pod \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\" (UID: \"a8437dad-a272-4baa-915e-fa0d39fb9fd2\") "
Apr 24 22:31:54.301581 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301564 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a8437dad-a272-4baa-915e-fa0d39fb9fd2" (UID: "a8437dad-a272-4baa-915e-fa0d39fb9fd2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:31:54.301848 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301812 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-config" (OuterVolumeSpecName: "console-config") pod "a8437dad-a272-4baa-915e-fa0d39fb9fd2" (UID: "a8437dad-a272-4baa-915e-fa0d39fb9fd2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:31:54.301848 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301829 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-trusted-ca-bundle\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:31:54.301973 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301883 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a8437dad-a272-4baa-915e-fa0d39fb9fd2" (UID: "a8437dad-a272-4baa-915e-fa0d39fb9fd2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:31:54.301973 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.301910 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-service-ca" (OuterVolumeSpecName: "service-ca") pod "a8437dad-a272-4baa-915e-fa0d39fb9fd2" (UID: "a8437dad-a272-4baa-915e-fa0d39fb9fd2"). InnerVolumeSpecName "service-ca".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:54.303404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.303378 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a8437dad-a272-4baa-915e-fa0d39fb9fd2" (UID: "a8437dad-a272-4baa-915e-fa0d39fb9fd2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:54.303654 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.303633 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a8437dad-a272-4baa-915e-fa0d39fb9fd2" (UID: "a8437dad-a272-4baa-915e-fa0d39fb9fd2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:54.303732 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.303707 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8437dad-a272-4baa-915e-fa0d39fb9fd2-kube-api-access-2v6l7" (OuterVolumeSpecName: "kube-api-access-2v6l7") pod "a8437dad-a272-4baa-915e-fa0d39fb9fd2" (UID: "a8437dad-a272-4baa-915e-fa0d39fb9fd2"). InnerVolumeSpecName "kube-api-access-2v6l7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:54.402567 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.402540 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2v6l7\" (UniqueName: \"kubernetes.io/projected/a8437dad-a272-4baa-915e-fa0d39fb9fd2-kube-api-access-2v6l7\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:31:54.402567 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.402564 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-oauth-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:31:54.402699 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.402579 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-service-ca\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:31:54.402699 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.402591 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-oauth-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:31:54.402699 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.402603 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:31:54.402699 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.402616 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8437dad-a272-4baa-915e-fa0d39fb9fd2-console-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:31:54.915234 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:31:54.915211 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fcb68766d-9t7jp_a8437dad-a272-4baa-915e-fa0d39fb9fd2/console/0.log" Apr 24 22:31:54.915632 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.915259 2572 generic.go:358] "Generic (PLEG): container finished" podID="a8437dad-a272-4baa-915e-fa0d39fb9fd2" containerID="0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b" exitCode=2 Apr 24 22:31:54.915632 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.915340 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fcb68766d-9t7jp" Apr 24 22:31:54.915632 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.915358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fcb68766d-9t7jp" event={"ID":"a8437dad-a272-4baa-915e-fa0d39fb9fd2","Type":"ContainerDied","Data":"0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b"} Apr 24 22:31:54.915632 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.915410 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fcb68766d-9t7jp" event={"ID":"a8437dad-a272-4baa-915e-fa0d39fb9fd2","Type":"ContainerDied","Data":"16ec3f3c0d98f6ec571667860bf67d2dbe6f554bf8898f3bfcafa70dfa10452f"} Apr 24 22:31:54.915632 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.915427 2572 scope.go:117] "RemoveContainer" containerID="0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b" Apr 24 22:31:54.924177 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.924152 2572 scope.go:117] "RemoveContainer" containerID="0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b" Apr 24 22:31:54.924447 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:31:54.924422 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b\": container with ID starting with 0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b not found: ID does not exist" containerID="0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b" Apr 24 22:31:54.924508 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.924454 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b"} err="failed to get container status \"0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b\": rpc error: code = NotFound desc = could not find container \"0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b\": container with ID starting with 0c858f4653bf39f4b7280e210fc8d9d6af7a985634c95f76efc072ad24cc886b not found: ID does not exist" Apr 24 22:31:54.936072 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.936036 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fcb68766d-9t7jp"] Apr 24 22:31:54.939730 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:54.939700 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-fcb68766d-9t7jp"] Apr 24 22:31:55.418161 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:55.418131 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8437dad-a272-4baa-915e-fa0d39fb9fd2" path="/var/lib/kubelet/pods/a8437dad-a272-4baa-915e-fa0d39fb9fd2/volumes" Apr 24 22:31:58.175600 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:58.175567 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:31:58.179736 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:31:58.179705 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-594c959ccb-pdjc9" Apr 24 22:32:20.149975 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:32:20.149935 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:20.171494 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:20.171470 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:21.008612 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:21.008587 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:37.911732 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.911689 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-685fd5dc87-ngx4q"] Apr 24 22:32:37.912242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.912050 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8437dad-a272-4baa-915e-fa0d39fb9fd2" containerName="console" Apr 24 22:32:37.912242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.912067 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8437dad-a272-4baa-915e-fa0d39fb9fd2" containerName="console" Apr 24 22:32:37.912242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.912094 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="749d05ed-a876-4e79-a759-a5f750366c90" containerName="console" Apr 24 22:32:37.912242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.912102 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="749d05ed-a876-4e79-a759-a5f750366c90" containerName="console" Apr 24 22:32:37.912242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.912177 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8437dad-a272-4baa-915e-fa0d39fb9fd2" containerName="console" Apr 24 22:32:37.912242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.912191 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="749d05ed-a876-4e79-a759-a5f750366c90" containerName="console" Apr 24 22:32:37.915792 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.915761 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:37.919424 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.919402 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 22:32:37.919424 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.919402 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 22:32:37.919602 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.919457 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 22:32:37.919744 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.919721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 22:32:37.920343 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.920316 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 22:32:37.920343 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.920333 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-hx2f5\"" Apr 24 22:32:37.931657 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.931632 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 22:32:37.934190 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:37.933065 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/telemeter-client-685fd5dc87-ngx4q"] Apr 24 22:32:38.034891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.034818 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-metrics-client-ca\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.034891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.034874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-secret-telemeter-client\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.035110 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.034924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9xws\" (UniqueName: \"kubernetes.io/projected/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-kube-api-access-w9xws\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.035110 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.034951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.035110 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.035035 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-serving-certs-ca-bundle\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.035110 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.035062 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-federate-client-tls\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.035110 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.035093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-telemeter-client-tls\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.035311 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.035162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.135605 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.135576 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" 
(UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-secret-telemeter-client\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.135711 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.135607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9xws\" (UniqueName: \"kubernetes.io/projected/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-kube-api-access-w9xws\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.135711 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.135626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.135930 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.135910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-serving-certs-ca-bundle\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.135991 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.135949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-federate-client-tls\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " 
pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.135991 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.135985 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-telemeter-client-tls\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.136126 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.136061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.136126 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.136112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-metrics-client-ca\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.136645 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.136592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-serving-certs-ca-bundle\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.136760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.136730 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-metrics-client-ca\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.136822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.136759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.138327 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.138299 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-secret-telemeter-client\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.138476 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.138448 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.138535 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.138519 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-federate-client-tls\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: 
\"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.138568 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.138530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-telemeter-client-tls\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.145230 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.145211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9xws\" (UniqueName: \"kubernetes.io/projected/6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1-kube-api-access-w9xws\") pod \"telemeter-client-685fd5dc87-ngx4q\" (UID: \"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1\") " pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.233959 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.233936 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" Apr 24 22:32:38.353729 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:38.353707 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-685fd5dc87-ngx4q"] Apr 24 22:32:38.355544 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:32:38.355516 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c6fbaa0_6698_42d2_90c3_3913ea9e8ea1.slice/crio-289d162ab3b3444022c1069eb8e8b8a3bba300f31bb08f5e148794d0524d9084 WatchSource:0}: Error finding container 289d162ab3b3444022c1069eb8e8b8a3bba300f31bb08f5e148794d0524d9084: Status 404 returned error can't find the container with id 289d162ab3b3444022c1069eb8e8b8a3bba300f31bb08f5e148794d0524d9084 Apr 24 22:32:39.046685 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:39.046653 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" event={"ID":"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1","Type":"ContainerStarted","Data":"289d162ab3b3444022c1069eb8e8b8a3bba300f31bb08f5e148794d0524d9084"} Apr 24 22:32:40.051770 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:40.051746 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" event={"ID":"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1","Type":"ContainerStarted","Data":"f8de6664724331058263d01d5630cea583103bbb101eb4ddd3a6833548be9458"} Apr 24 22:32:40.052061 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:40.051779 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" event={"ID":"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1","Type":"ContainerStarted","Data":"cebadc29e2de7ee8e9b04b2c4752d52f301296b5b446ba4171172a66c728ad3a"} Apr 24 22:32:41.056784 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.056750 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" event={"ID":"6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1","Type":"ContainerStarted","Data":"7601d595051b8b10fac795b49d852a013ffd863bd8dafb08da834da78f8490d8"} Apr 24 22:32:41.080683 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.080630 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-685fd5dc87-ngx4q" podStartSLOduration=2.554541983 podStartE2EDuration="4.080615089s" podCreationTimestamp="2026-04-24 22:32:37 +0000 UTC" firstStartedPulling="2026-04-24 22:32:38.357804786 +0000 UTC m=+165.506110315" lastFinishedPulling="2026-04-24 22:32:39.883877895 +0000 UTC m=+167.032183421" observedRunningTime="2026-04-24 22:32:41.079767839 +0000 UTC m=+168.228073444" watchObservedRunningTime="2026-04-24 22:32:41.080615089 +0000 UTC m=+168.228920637" Apr 24 22:32:41.691140 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.691107 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b97b87699-44kpl"] Apr 24 22:32:41.694447 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.694422 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.704912 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.704886 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b97b87699-44kpl"] Apr 24 22:32:41.767492 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.767458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-serving-cert\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.767571 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.767509 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-trusted-ca-bundle\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.767623 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.767594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6lfw\" (UniqueName: \"kubernetes.io/projected/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-kube-api-access-z6lfw\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.767658 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.767634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-oauth-serving-cert\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " 
pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.767691 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.767672 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-config\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.767725 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.767707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-oauth-config\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.767725 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.767723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-service-ca\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.868138 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-oauth-serving-cert\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.868240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868147 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-config\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.868240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-oauth-config\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.868240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-service-ca\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.868240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-serving-cert\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.868428 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868262 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-trusted-ca-bundle\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.868428 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868308 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z6lfw\" (UniqueName: \"kubernetes.io/projected/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-kube-api-access-z6lfw\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.869007 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-config\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.869140 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-oauth-serving-cert\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.869140 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.868999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-service-ca\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.869140 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.869130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-trusted-ca-bundle\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.870754 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.870727 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-serving-cert\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.870838 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.870753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-oauth-config\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:41.877459 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:41.877440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6lfw\" (UniqueName: \"kubernetes.io/projected/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-kube-api-access-z6lfw\") pod \"console-5b97b87699-44kpl\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:42.004912 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:42.004881 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:32:42.140776 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:42.140729 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b97b87699-44kpl"] Apr 24 22:32:42.143657 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:32:42.143626 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fa15ea2_eb77_4643_8f97_5b6eff0695eb.slice/crio-516be2588f578d34f33e49ef5af3dd3db7f6896ab416e8c6027ae928a2960e1b WatchSource:0}: Error finding container 516be2588f578d34f33e49ef5af3dd3db7f6896ab416e8c6027ae928a2960e1b: Status 404 returned error can't find the container with id 516be2588f578d34f33e49ef5af3dd3db7f6896ab416e8c6027ae928a2960e1b Apr 24 22:32:43.065171 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:43.065123 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b97b87699-44kpl" event={"ID":"8fa15ea2-eb77-4643-8f97-5b6eff0695eb","Type":"ContainerStarted","Data":"c8c52e0adfd5166d9e3194f90fb0d6a8e3b4aa43108ab84dd5a75c9132d8f88f"} Apr 24 22:32:43.065171 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:43.065170 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b97b87699-44kpl" event={"ID":"8fa15ea2-eb77-4643-8f97-5b6eff0695eb","Type":"ContainerStarted","Data":"516be2588f578d34f33e49ef5af3dd3db7f6896ab416e8c6027ae928a2960e1b"} Apr 24 22:32:43.085842 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:43.085790 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b97b87699-44kpl" podStartSLOduration=2.08577646 podStartE2EDuration="2.08577646s" podCreationTimestamp="2026-04-24 22:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:43.083391604 +0000 UTC m=+170.231697174" 
watchObservedRunningTime="2026-04-24 22:32:43.08577646 +0000 UTC m=+170.234082007" Apr 24 22:32:50.021239 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.021203 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b97b87699-44kpl"] Apr 24 22:32:50.052578 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.052548 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55bbbfbc94-mrrkz"] Apr 24 22:32:50.055998 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.055978 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.070776 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.070747 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55bbbfbc94-mrrkz"] Apr 24 22:32:50.134289 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.134265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-oauth-serving-cert\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.134404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.134301 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-trusted-ca-bundle\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.134404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.134330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-config\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.134404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.134390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-oauth-config\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.134537 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.134442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt5xb\" (UniqueName: \"kubernetes.io/projected/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-kube-api-access-qt5xb\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.134537 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.134476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-serving-cert\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.134537 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.134494 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-service-ca\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.235445 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:32:50.235418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-serving-cert\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.235445 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.235445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-service-ca\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.235631 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.235578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-oauth-serving-cert\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.235631 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.235618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-trusted-ca-bundle\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.235760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.235742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-config\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " 
pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.235819 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.235788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-oauth-config\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.235876 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.235816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt5xb\" (UniqueName: \"kubernetes.io/projected/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-kube-api-access-qt5xb\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.236128 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.236076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-service-ca\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.236271 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.236247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-oauth-serving-cert\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.236429 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.236411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-config\") pod \"console-55bbbfbc94-mrrkz\" (UID: 
\"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.236490 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.236460 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-trusted-ca-bundle\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.237810 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.237792 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-serving-cert\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.237916 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.237898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-oauth-config\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.243574 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.243549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt5xb\" (UniqueName: \"kubernetes.io/projected/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-kube-api-access-qt5xb\") pod \"console-55bbbfbc94-mrrkz\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") " pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.366056 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.365986 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:32:50.688641 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:50.688619 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55bbbfbc94-mrrkz"] Apr 24 22:32:50.690953 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:32:50.690915 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d05ee12_3dd2_4abf_8b9e_4a2f0eceb34b.slice/crio-68c9648b502deb556ffd53c8b95564ae98ce1b0396599cb61a39d3f977870bf2 WatchSource:0}: Error finding container 68c9648b502deb556ffd53c8b95564ae98ce1b0396599cb61a39d3f977870bf2: Status 404 returned error can't find the container with id 68c9648b502deb556ffd53c8b95564ae98ce1b0396599cb61a39d3f977870bf2 Apr 24 22:32:51.092440 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:51.092405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bbbfbc94-mrrkz" event={"ID":"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b","Type":"ContainerStarted","Data":"43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71"} Apr 24 22:32:51.092440 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:51.092440 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bbbfbc94-mrrkz" event={"ID":"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b","Type":"ContainerStarted","Data":"68c9648b502deb556ffd53c8b95564ae98ce1b0396599cb61a39d3f977870bf2"} Apr 24 22:32:51.111585 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:51.111534 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55bbbfbc94-mrrkz" podStartSLOduration=1.111516218 podStartE2EDuration="1.111516218s" podCreationTimestamp="2026-04-24 22:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:51.109646216 +0000 UTC 
m=+178.257951764" watchObservedRunningTime="2026-04-24 22:32:51.111516218 +0000 UTC m=+178.259821767" Apr 24 22:32:52.005834 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:32:52.005801 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:33:00.366822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:00.366783 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:33:00.366822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:00.366826 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:33:00.371774 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:00.371745 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:33:01.126996 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:01.126966 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55bbbfbc94-mrrkz" Apr 24 22:33:01.198831 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:01.198799 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dbff5548c-mbt58"] Apr 24 22:33:15.040747 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.040686 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b97b87699-44kpl" podUID="8fa15ea2-eb77-4643-8f97-5b6eff0695eb" containerName="console" containerID="cri-o://c8c52e0adfd5166d9e3194f90fb0d6a8e3b4aa43108ab84dd5a75c9132d8f88f" gracePeriod=15 Apr 24 22:33:15.168204 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.168183 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b97b87699-44kpl_8fa15ea2-eb77-4643-8f97-5b6eff0695eb/console/0.log" Apr 24 22:33:15.168309 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:33:15.168221 2572 generic.go:358] "Generic (PLEG): container finished" podID="8fa15ea2-eb77-4643-8f97-5b6eff0695eb" containerID="c8c52e0adfd5166d9e3194f90fb0d6a8e3b4aa43108ab84dd5a75c9132d8f88f" exitCode=2 Apr 24 22:33:15.168357 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.168308 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b97b87699-44kpl" event={"ID":"8fa15ea2-eb77-4643-8f97-5b6eff0695eb","Type":"ContainerDied","Data":"c8c52e0adfd5166d9e3194f90fb0d6a8e3b4aa43108ab84dd5a75c9132d8f88f"} Apr 24 22:33:15.283419 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.283399 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b97b87699-44kpl_8fa15ea2-eb77-4643-8f97-5b6eff0695eb/console/0.log" Apr 24 22:33:15.283517 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.283461 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:33:15.321552 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321502 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-service-ca\") pod \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " Apr 24 22:33:15.321644 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321563 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-config\") pod \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " Apr 24 22:33:15.321682 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321668 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-serving-cert\") pod \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " Apr 24 22:33:15.321716 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321695 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-trusted-ca-bundle\") pod \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " Apr 24 22:33:15.321766 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321734 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-oauth-config\") pod \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " Apr 24 22:33:15.321766 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321758 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6lfw\" (UniqueName: \"kubernetes.io/projected/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-kube-api-access-z6lfw\") pod \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " Apr 24 22:33:15.321866 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321783 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-oauth-serving-cert\") pod \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\" (UID: \"8fa15ea2-eb77-4643-8f97-5b6eff0695eb\") " Apr 24 22:33:15.321866 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321820 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-service-ca" (OuterVolumeSpecName: "service-ca") pod 
"8fa15ea2-eb77-4643-8f97-5b6eff0695eb" (UID: "8fa15ea2-eb77-4643-8f97-5b6eff0695eb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:15.321962 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.321891 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-config" (OuterVolumeSpecName: "console-config") pod "8fa15ea2-eb77-4643-8f97-5b6eff0695eb" (UID: "8fa15ea2-eb77-4643-8f97-5b6eff0695eb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:15.322184 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.322156 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8fa15ea2-eb77-4643-8f97-5b6eff0695eb" (UID: "8fa15ea2-eb77-4643-8f97-5b6eff0695eb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:15.322290 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.322208 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8fa15ea2-eb77-4643-8f97-5b6eff0695eb" (UID: "8fa15ea2-eb77-4643-8f97-5b6eff0695eb"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:15.322290 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.322220 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-service-ca\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:15.322290 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.322240 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:15.323673 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.323652 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8fa15ea2-eb77-4643-8f97-5b6eff0695eb" (UID: "8fa15ea2-eb77-4643-8f97-5b6eff0695eb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:15.323761 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.323732 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8fa15ea2-eb77-4643-8f97-5b6eff0695eb" (UID: "8fa15ea2-eb77-4643-8f97-5b6eff0695eb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:15.323822 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.323788 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-kube-api-access-z6lfw" (OuterVolumeSpecName: "kube-api-access-z6lfw") pod "8fa15ea2-eb77-4643-8f97-5b6eff0695eb" (UID: "8fa15ea2-eb77-4643-8f97-5b6eff0695eb"). 
InnerVolumeSpecName "kube-api-access-z6lfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:15.422638 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.422618 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:15.422638 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.422637 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-trusted-ca-bundle\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:15.422750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.422647 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-console-oauth-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:15.422750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.422656 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6lfw\" (UniqueName: \"kubernetes.io/projected/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-kube-api-access-z6lfw\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:15.422750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:15.422664 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8fa15ea2-eb77-4643-8f97-5b6eff0695eb-oauth-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:16.173248 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:16.173223 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b97b87699-44kpl_8fa15ea2-eb77-4643-8f97-5b6eff0695eb/console/0.log" Apr 24 22:33:16.173584 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:33:16.173275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b97b87699-44kpl" event={"ID":"8fa15ea2-eb77-4643-8f97-5b6eff0695eb","Type":"ContainerDied","Data":"516be2588f578d34f33e49ef5af3dd3db7f6896ab416e8c6027ae928a2960e1b"} Apr 24 22:33:16.173584 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:16.173315 2572 scope.go:117] "RemoveContainer" containerID="c8c52e0adfd5166d9e3194f90fb0d6a8e3b4aa43108ab84dd5a75c9132d8f88f" Apr 24 22:33:16.173810 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:16.173776 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b97b87699-44kpl" Apr 24 22:33:16.195906 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:16.195880 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b97b87699-44kpl"] Apr 24 22:33:16.199290 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:16.199265 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b97b87699-44kpl"] Apr 24 22:33:17.417102 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:17.417067 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa15ea2-eb77-4643-8f97-5b6eff0695eb" path="/var/lib/kubelet/pods/8fa15ea2-eb77-4643-8f97-5b6eff0695eb/volumes" Apr 24 22:33:26.217284 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.217218 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5dbff5548c-mbt58" podUID="e439b2ed-4157-4b89-8773-c42bc87265e8" containerName="console" containerID="cri-o://988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e" gracePeriod=15 Apr 24 22:33:26.458163 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.458143 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dbff5548c-mbt58_e439b2ed-4157-4b89-8773-c42bc87265e8/console/0.log" Apr 24 22:33:26.458282 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.458236 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:33:26.502766 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.502744 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-serving-cert\") pod \"e439b2ed-4157-4b89-8773-c42bc87265e8\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " Apr 24 22:33:26.502881 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.502773 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-service-ca\") pod \"e439b2ed-4157-4b89-8773-c42bc87265e8\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " Apr 24 22:33:26.502881 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.502832 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-oauth-serving-cert\") pod \"e439b2ed-4157-4b89-8773-c42bc87265e8\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " Apr 24 22:33:26.502881 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.502870 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-console-config\") pod \"e439b2ed-4157-4b89-8773-c42bc87265e8\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " Apr 24 22:33:26.503051 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.502898 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-oauth-config\") pod 
\"e439b2ed-4157-4b89-8773-c42bc87265e8\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " Apr 24 22:33:26.503051 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.502929 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh2x2\" (UniqueName: \"kubernetes.io/projected/e439b2ed-4157-4b89-8773-c42bc87265e8-kube-api-access-zh2x2\") pod \"e439b2ed-4157-4b89-8773-c42bc87265e8\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " Apr 24 22:33:26.503051 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.502961 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-trusted-ca-bundle\") pod \"e439b2ed-4157-4b89-8773-c42bc87265e8\" (UID: \"e439b2ed-4157-4b89-8773-c42bc87265e8\") " Apr 24 22:33:26.503402 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.503263 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e439b2ed-4157-4b89-8773-c42bc87265e8" (UID: "e439b2ed-4157-4b89-8773-c42bc87265e8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:26.503506 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.503424 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-service-ca" (OuterVolumeSpecName: "service-ca") pod "e439b2ed-4157-4b89-8773-c42bc87265e8" (UID: "e439b2ed-4157-4b89-8773-c42bc87265e8"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:26.503506 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.503438 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-oauth-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:26.503639 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.503615 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e439b2ed-4157-4b89-8773-c42bc87265e8" (UID: "e439b2ed-4157-4b89-8773-c42bc87265e8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:26.503690 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.503647 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-console-config" (OuterVolumeSpecName: "console-config") pod "e439b2ed-4157-4b89-8773-c42bc87265e8" (UID: "e439b2ed-4157-4b89-8773-c42bc87265e8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:26.505143 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.505121 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e439b2ed-4157-4b89-8773-c42bc87265e8" (UID: "e439b2ed-4157-4b89-8773-c42bc87265e8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:26.505269 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.505254 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e439b2ed-4157-4b89-8773-c42bc87265e8" (UID: "e439b2ed-4157-4b89-8773-c42bc87265e8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:26.505351 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.505331 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e439b2ed-4157-4b89-8773-c42bc87265e8-kube-api-access-zh2x2" (OuterVolumeSpecName: "kube-api-access-zh2x2") pod "e439b2ed-4157-4b89-8773-c42bc87265e8" (UID: "e439b2ed-4157-4b89-8773-c42bc87265e8"). InnerVolumeSpecName "kube-api-access-zh2x2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:26.604672 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.604648 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-console-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:26.604672 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.604671 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-oauth-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:26.604793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.604681 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zh2x2\" (UniqueName: \"kubernetes.io/projected/e439b2ed-4157-4b89-8773-c42bc87265e8-kube-api-access-zh2x2\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:26.604793 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.604691 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-trusted-ca-bundle\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:26.604793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.604700 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e439b2ed-4157-4b89-8773-c42bc87265e8-console-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:26.604793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:26.604709 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e439b2ed-4157-4b89-8773-c42bc87265e8-service-ca\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:33:27.213083 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.213056 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dbff5548c-mbt58_e439b2ed-4157-4b89-8773-c42bc87265e8/console/0.log" Apr 24 22:33:27.213208 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.213100 2572 generic.go:358] "Generic (PLEG): container finished" podID="e439b2ed-4157-4b89-8773-c42bc87265e8" containerID="988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e" exitCode=2 Apr 24 22:33:27.213208 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.213170 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbff5548c-mbt58" event={"ID":"e439b2ed-4157-4b89-8773-c42bc87265e8","Type":"ContainerDied","Data":"988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e"} Apr 24 22:33:27.213208 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.213188 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5dbff5548c-mbt58" Apr 24 22:33:27.213208 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.213200 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbff5548c-mbt58" event={"ID":"e439b2ed-4157-4b89-8773-c42bc87265e8","Type":"ContainerDied","Data":"1555fb7bef2f6216db1d539b9b1f2544490b4ed470ea1edb19dd5466a7442812"} Apr 24 22:33:27.213355 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.213217 2572 scope.go:117] "RemoveContainer" containerID="988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e" Apr 24 22:33:27.224245 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.224215 2572 scope.go:117] "RemoveContainer" containerID="988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e" Apr 24 22:33:27.225069 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:33:27.225034 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e\": container with ID starting with 988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e not found: ID does not exist" containerID="988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e" Apr 24 22:33:27.225286 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.225213 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e"} err="failed to get container status \"988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e\": rpc error: code = NotFound desc = could not find container \"988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e\": container with ID starting with 988a6deac8228acf783ee21f4ec57dedae0e14c0acb5e3bc2661aec3cc818a6e not found: ID does not exist" Apr 24 22:33:27.241809 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.241776 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dbff5548c-mbt58"] Apr 24 22:33:27.244073 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.244051 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5dbff5548c-mbt58"] Apr 24 22:33:27.247497 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:33:27.247473 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode439b2ed_4157_4b89_8773_c42bc87265e8.slice\": RecentStats: unable to find data in memory cache]" Apr 24 22:33:27.247592 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:33:27.247573 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode439b2ed_4157_4b89_8773_c42bc87265e8.slice\": RecentStats: unable to find data in memory cache]" Apr 24 22:33:27.417419 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:27.417391 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e439b2ed-4157-4b89-8773-c42bc87265e8" path="/var/lib/kubelet/pods/e439b2ed-4157-4b89-8773-c42bc87265e8/volumes" Apr 24 22:33:34.076652 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.076616 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bnnl8"] Apr 24 22:33:34.077043 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.076912 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e439b2ed-4157-4b89-8773-c42bc87265e8" containerName="console" Apr 24 22:33:34.077043 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.076924 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e439b2ed-4157-4b89-8773-c42bc87265e8" containerName="console" Apr 24 22:33:34.077043 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.076931 2572 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="8fa15ea2-eb77-4643-8f97-5b6eff0695eb" containerName="console" Apr 24 22:33:34.077043 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.076938 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa15ea2-eb77-4643-8f97-5b6eff0695eb" containerName="console" Apr 24 22:33:34.077043 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.076981 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8fa15ea2-eb77-4643-8f97-5b6eff0695eb" containerName="console" Apr 24 22:33:34.077043 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.076990 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e439b2ed-4157-4b89-8773-c42bc87265e8" containerName="console" Apr 24 22:33:34.081198 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.081177 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.083732 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.083709 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 22:33:34.089328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.089306 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bnnl8"] Apr 24 22:33:34.158541 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.158518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/10c617f4-fae1-4f73-af0e-8e8500ece009-kubelet-config\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.158626 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.158558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/10c617f4-fae1-4f73-af0e-8e8500ece009-original-pull-secret\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.158626 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.158582 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/10c617f4-fae1-4f73-af0e-8e8500ece009-dbus\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.259693 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.259670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/10c617f4-fae1-4f73-af0e-8e8500ece009-kubelet-config\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.259791 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.259711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/10c617f4-fae1-4f73-af0e-8e8500ece009-original-pull-secret\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.259791 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.259735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/10c617f4-fae1-4f73-af0e-8e8500ece009-dbus\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.259861 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.259792 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/10c617f4-fae1-4f73-af0e-8e8500ece009-kubelet-config\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.259895 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.259874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/10c617f4-fae1-4f73-af0e-8e8500ece009-dbus\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.261971 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.261952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/10c617f4-fae1-4f73-af0e-8e8500ece009-original-pull-secret\") pod \"global-pull-secret-syncer-bnnl8\" (UID: \"10c617f4-fae1-4f73-af0e-8e8500ece009\") " pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.391389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.391311 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bnnl8" Apr 24 22:33:34.509954 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:34.509931 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bnnl8"] Apr 24 22:33:34.512424 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:33:34.512397 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c617f4_fae1_4f73_af0e_8e8500ece009.slice/crio-a3b603abd4158841dae9df5882774f1ac9ca578f56851444c59a61581d9c2172 WatchSource:0}: Error finding container a3b603abd4158841dae9df5882774f1ac9ca578f56851444c59a61581d9c2172: Status 404 returned error can't find the container with id a3b603abd4158841dae9df5882774f1ac9ca578f56851444c59a61581d9c2172 Apr 24 22:33:35.238773 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:35.238739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bnnl8" event={"ID":"10c617f4-fae1-4f73-af0e-8e8500ece009","Type":"ContainerStarted","Data":"a3b603abd4158841dae9df5882774f1ac9ca578f56851444c59a61581d9c2172"} Apr 24 22:33:39.252810 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:39.252769 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bnnl8" event={"ID":"10c617f4-fae1-4f73-af0e-8e8500ece009","Type":"ContainerStarted","Data":"230fcdb72f13cadef014536150bc327ea23276e2d4d25e4796a3d86b6eb0fc34"} Apr 24 22:33:39.270726 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:33:39.270680 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bnnl8" podStartSLOduration=1.601200549 podStartE2EDuration="5.270663777s" podCreationTimestamp="2026-04-24 22:33:34 +0000 UTC" firstStartedPulling="2026-04-24 22:33:34.514095941 +0000 UTC m=+221.662401467" lastFinishedPulling="2026-04-24 22:33:38.183559159 +0000 UTC m=+225.331864695" 
observedRunningTime="2026-04-24 22:33:39.268767772 +0000 UTC m=+226.417073316" watchObservedRunningTime="2026-04-24 22:33:39.270663777 +0000 UTC m=+226.418969363" Apr 24 22:34:53.330073 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:34:53.330046 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:36:40.011266 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.011233 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-cz254"] Apr 24 22:36:40.013990 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.013971 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:40.016873 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.016848 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 22:36:40.017072 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.016990 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 22:36:40.017072 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.017002 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-6krr4\"" Apr 24 22:36:40.018341 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.018327 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 22:36:40.024163 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.024143 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cz254"] Apr 24 22:36:40.094568 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.094547 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-tls-certs\") pod 
\"model-serving-api-86f7b4b499-cz254\" (UID: \"6cfa4b36-c34f-4afc-874b-e5c120a77e1c\") " pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:40.094678 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.094613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srg26\" (UniqueName: \"kubernetes.io/projected/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-kube-api-access-srg26\") pod \"model-serving-api-86f7b4b499-cz254\" (UID: \"6cfa4b36-c34f-4afc-874b-e5c120a77e1c\") " pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:40.195124 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.195096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-tls-certs\") pod \"model-serving-api-86f7b4b499-cz254\" (UID: \"6cfa4b36-c34f-4afc-874b-e5c120a77e1c\") " pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:40.195234 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.195153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srg26\" (UniqueName: \"kubernetes.io/projected/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-kube-api-access-srg26\") pod \"model-serving-api-86f7b4b499-cz254\" (UID: \"6cfa4b36-c34f-4afc-874b-e5c120a77e1c\") " pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:40.195275 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:36:40.195239 2572 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 22:36:40.195315 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:36:40.195304 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-tls-certs podName:6cfa4b36-c34f-4afc-874b-e5c120a77e1c nodeName:}" failed. 
No retries permitted until 2026-04-24 22:36:40.695287941 +0000 UTC m=+407.843593467 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-tls-certs") pod "model-serving-api-86f7b4b499-cz254" (UID: "6cfa4b36-c34f-4afc-874b-e5c120a77e1c") : secret "model-serving-api-tls" not found Apr 24 22:36:40.204181 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.204162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srg26\" (UniqueName: \"kubernetes.io/projected/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-kube-api-access-srg26\") pod \"model-serving-api-86f7b4b499-cz254\" (UID: \"6cfa4b36-c34f-4afc-874b-e5c120a77e1c\") " pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:40.699132 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.699102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-tls-certs\") pod \"model-serving-api-86f7b4b499-cz254\" (UID: \"6cfa4b36-c34f-4afc-874b-e5c120a77e1c\") " pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:40.701260 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.701230 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfa4b36-c34f-4afc-874b-e5c120a77e1c-tls-certs\") pod \"model-serving-api-86f7b4b499-cz254\" (UID: \"6cfa4b36-c34f-4afc-874b-e5c120a77e1c\") " pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:40.924584 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:40.924539 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:41.051339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:41.051314 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cz254"] Apr 24 22:36:41.053052 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:36:41.053026 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cfa4b36_c34f_4afc_874b_e5c120a77e1c.slice/crio-aea32bb7ded0167e36ef6c54ea6b9691a78762ace161862a285bbfcbd002adbb WatchSource:0}: Error finding container aea32bb7ded0167e36ef6c54ea6b9691a78762ace161862a285bbfcbd002adbb: Status 404 returned error can't find the container with id aea32bb7ded0167e36ef6c54ea6b9691a78762ace161862a285bbfcbd002adbb Apr 24 22:36:41.054808 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:41.054792 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:36:41.800813 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:41.800781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cz254" event={"ID":"6cfa4b36-c34f-4afc-874b-e5c120a77e1c","Type":"ContainerStarted","Data":"aea32bb7ded0167e36ef6c54ea6b9691a78762ace161862a285bbfcbd002adbb"} Apr 24 22:36:43.807843 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:43.807806 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cz254" event={"ID":"6cfa4b36-c34f-4afc-874b-e5c120a77e1c","Type":"ContainerStarted","Data":"d5d7622c8d516e20716bf665bea7a9aea9f672882952dbf0de3c6421ca64cbdb"} Apr 24 22:36:43.808232 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:43.807948 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:43.824382 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:43.824342 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-cz254" podStartSLOduration=2.628624008 podStartE2EDuration="4.824330769s" podCreationTimestamp="2026-04-24 22:36:39 +0000 UTC" firstStartedPulling="2026-04-24 22:36:41.054949425 +0000 UTC m=+408.203254951" lastFinishedPulling="2026-04-24 22:36:43.250656186 +0000 UTC m=+410.398961712" observedRunningTime="2026-04-24 22:36:43.824178566 +0000 UTC m=+410.972484125" watchObservedRunningTime="2026-04-24 22:36:43.824330769 +0000 UTC m=+410.972636317" Apr 24 22:36:54.814891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:54.814861 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-cz254" Apr 24 22:36:54.835549 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:54.835524 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-px5c7"] Apr 24 22:36:54.838998 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:54.838973 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-px5c7" Apr 24 22:36:54.842852 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:54.842827 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vqhfc\"" Apr 24 22:36:54.843208 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:54.843192 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 22:36:54.863221 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:54.863198 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-px5c7"] Apr 24 22:36:54.901637 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:54.901616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/491b72ff-7459-4493-9992-dcf733d6c92e-kube-api-access-sj6m7\") pod \"s3-init-px5c7\" (UID: \"491b72ff-7459-4493-9992-dcf733d6c92e\") " pod="kserve/s3-init-px5c7" Apr 24 22:36:55.002242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:55.002219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/491b72ff-7459-4493-9992-dcf733d6c92e-kube-api-access-sj6m7\") pod \"s3-init-px5c7\" (UID: \"491b72ff-7459-4493-9992-dcf733d6c92e\") " pod="kserve/s3-init-px5c7" Apr 24 22:36:55.015324 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:55.015295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/491b72ff-7459-4493-9992-dcf733d6c92e-kube-api-access-sj6m7\") pod \"s3-init-px5c7\" (UID: \"491b72ff-7459-4493-9992-dcf733d6c92e\") " pod="kserve/s3-init-px5c7" Apr 24 22:36:55.160588 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:55.160525 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-px5c7" Apr 24 22:36:55.289796 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:55.289772 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-px5c7"] Apr 24 22:36:55.291685 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:36:55.291655 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491b72ff_7459_4493_9992_dcf733d6c92e.slice/crio-194bc13db7ed9e264ffa592c668c2b5a9c0e5a64c4e715588b8492b01403b518 WatchSource:0}: Error finding container 194bc13db7ed9e264ffa592c668c2b5a9c0e5a64c4e715588b8492b01403b518: Status 404 returned error can't find the container with id 194bc13db7ed9e264ffa592c668c2b5a9c0e5a64c4e715588b8492b01403b518 Apr 24 22:36:55.846799 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:55.846748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-px5c7" event={"ID":"491b72ff-7459-4493-9992-dcf733d6c92e","Type":"ContainerStarted","Data":"194bc13db7ed9e264ffa592c668c2b5a9c0e5a64c4e715588b8492b01403b518"} Apr 24 22:36:59.860799 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:59.860763 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-px5c7" event={"ID":"491b72ff-7459-4493-9992-dcf733d6c92e","Type":"ContainerStarted","Data":"4b9b728c3eaea178a34f6207e560e6fcafec594bf503a05f90ff9a97d668be09"} Apr 24 22:36:59.876466 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:36:59.876420 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-px5c7" podStartSLOduration=1.474671756 podStartE2EDuration="5.876407075s" podCreationTimestamp="2026-04-24 22:36:54 +0000 UTC" firstStartedPulling="2026-04-24 22:36:55.293586186 +0000 UTC m=+422.441891724" lastFinishedPulling="2026-04-24 22:36:59.695321511 +0000 UTC m=+426.843627043" observedRunningTime="2026-04-24 22:36:59.875172473 +0000 UTC m=+427.023478022" watchObservedRunningTime="2026-04-24 
22:36:59.876407075 +0000 UTC m=+427.024712642" Apr 24 22:37:02.872191 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:02.872126 2572 generic.go:358] "Generic (PLEG): container finished" podID="491b72ff-7459-4493-9992-dcf733d6c92e" containerID="4b9b728c3eaea178a34f6207e560e6fcafec594bf503a05f90ff9a97d668be09" exitCode=0 Apr 24 22:37:02.872513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:02.872197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-px5c7" event={"ID":"491b72ff-7459-4493-9992-dcf733d6c92e","Type":"ContainerDied","Data":"4b9b728c3eaea178a34f6207e560e6fcafec594bf503a05f90ff9a97d668be09"} Apr 24 22:37:04.001584 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:04.001565 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-px5c7" Apr 24 22:37:04.076365 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:04.076340 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/491b72ff-7459-4493-9992-dcf733d6c92e-kube-api-access-sj6m7\") pod \"491b72ff-7459-4493-9992-dcf733d6c92e\" (UID: \"491b72ff-7459-4493-9992-dcf733d6c92e\") " Apr 24 22:37:04.078353 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:04.078331 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491b72ff-7459-4493-9992-dcf733d6c92e-kube-api-access-sj6m7" (OuterVolumeSpecName: "kube-api-access-sj6m7") pod "491b72ff-7459-4493-9992-dcf733d6c92e" (UID: "491b72ff-7459-4493-9992-dcf733d6c92e"). InnerVolumeSpecName "kube-api-access-sj6m7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:37:04.176850 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:04.176799 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/491b72ff-7459-4493-9992-dcf733d6c92e-kube-api-access-sj6m7\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:37:04.879577 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:04.879546 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-px5c7" event={"ID":"491b72ff-7459-4493-9992-dcf733d6c92e","Type":"ContainerDied","Data":"194bc13db7ed9e264ffa592c668c2b5a9c0e5a64c4e715588b8492b01403b518"} Apr 24 22:37:04.879577 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:04.879564 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-px5c7" Apr 24 22:37:04.879577 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:04.879576 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="194bc13db7ed9e264ffa592c668c2b5a9c0e5a64c4e715588b8492b01403b518" Apr 24 22:37:13.982740 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.982652 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt"] Apr 24 22:37:13.983247 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.982993 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="491b72ff-7459-4493-9992-dcf733d6c92e" containerName="s3-init" Apr 24 22:37:13.983247 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.983004 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="491b72ff-7459-4493-9992-dcf733d6c92e" containerName="s3-init" Apr 24 22:37:13.983247 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.983073 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="491b72ff-7459-4493-9992-dcf733d6c92e" containerName="s3-init" Apr 24 
22:37:13.987059 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.987037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:13.990891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.990860 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-dd47d-predictor-serving-cert\"" Apr 24 22:37:13.991089 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.991069 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:37:13.991567 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.991548 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gtqzv\"" Apr 24 22:37:13.991613 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.991541 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\"" Apr 24 22:37:13.992390 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:13.992372 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 22:37:14.000613 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.000590 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt"] Apr 24 22:37:14.043716 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.043694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\") pod 
\"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.043815 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.043735 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.043815 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.043763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frz59\" (UniqueName: \"kubernetes.io/projected/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kube-api-access-frz59\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.043888 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.043844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-proxy-tls\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.144955 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.144933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.145082 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.144970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.145082 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.144989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frz59\" (UniqueName: \"kubernetes.io/projected/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kube-api-access-frz59\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.145082 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.145061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-proxy-tls\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.145464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.145441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.145658 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.145640 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.147514 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.147497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-proxy-tls\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.154386 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.154363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frz59\" (UniqueName: \"kubernetes.io/projected/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kube-api-access-frz59\") pod \"isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.298318 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.298263 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:14.420775 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.420746 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt"] Apr 24 22:37:14.423646 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:37:14.423612 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cf881dd_dc1a_4ac1_8717_4cff33fc74be.slice/crio-91fbd3da666542b7ab6fe432892577f9af5dcef6c2c6b331d53cc1486b1d7854 WatchSource:0}: Error finding container 91fbd3da666542b7ab6fe432892577f9af5dcef6c2c6b331d53cc1486b1d7854: Status 404 returned error can't find the container with id 91fbd3da666542b7ab6fe432892577f9af5dcef6c2c6b331d53cc1486b1d7854 Apr 24 22:37:14.910219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:14.910187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerStarted","Data":"91fbd3da666542b7ab6fe432892577f9af5dcef6c2c6b331d53cc1486b1d7854"} Apr 24 22:37:18.923954 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:18.923913 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerStarted","Data":"fd397ea13b6d79d0ab05331542134b0fe30018ae66b7e01c268a465cceaa6744"} Apr 24 22:37:21.934596 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:21.934518 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerID="fd397ea13b6d79d0ab05331542134b0fe30018ae66b7e01c268a465cceaa6744" exitCode=0 Apr 24 22:37:21.934957 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:21.934598 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerDied","Data":"fd397ea13b6d79d0ab05331542134b0fe30018ae66b7e01c268a465cceaa6744"} Apr 24 22:37:34.984993 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:34.984966 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerStarted","Data":"f52f1857c57c857be065804869ddbd222e1f2742d14e22a785eb5b177b126db0"} Apr 24 22:37:37.996744 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:37.996709 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerStarted","Data":"51d7b30b73122c94f7f99843891b31cbb37254a2d04aaa01c11b3909e65460c0"} Apr 24 22:37:40.005477 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:40.005443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerStarted","Data":"1bc6b0bcf38fee28c2674025520be03e775fc64480065b2bd7cb0691fb74db1f"} Apr 24 22:37:40.005855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:40.005682 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:40.005855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:40.005831 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:40.006978 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:40.006938 2572 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 22:37:40.027860 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:40.027817 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podStartSLOduration=1.753790835 podStartE2EDuration="27.027804737s" podCreationTimestamp="2026-04-24 22:37:13 +0000 UTC" firstStartedPulling="2026-04-24 22:37:14.425538211 +0000 UTC m=+441.573843740" lastFinishedPulling="2026-04-24 22:37:39.699552113 +0000 UTC m=+466.847857642" observedRunningTime="2026-04-24 22:37:40.027340308 +0000 UTC m=+467.175645857" watchObservedRunningTime="2026-04-24 22:37:40.027804737 +0000 UTC m=+467.176110284" Apr 24 22:37:41.008846 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:41.008812 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:41.009291 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:41.008933 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 22:37:41.009742 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:41.009717 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:37:42.012033 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:37:42.011972 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 22:37:42.012531 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:42.012481 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:37:42.015894 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:42.015871 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:37:43.014329 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:43.014292 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 22:37:43.014767 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:43.014546 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:37:47.011416 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.011381 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64496c466b-btqrw"] Apr 24 22:37:47.023543 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.023519 2572 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.058497 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.058472 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64496c466b-btqrw"] Apr 24 22:37:47.210887 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.210860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-oauth-serving-cert\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.211006 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.210921 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-service-ca\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.211006 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.210973 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5t8k\" (UniqueName: \"kubernetes.io/projected/bda88ed8-ba87-422a-8cef-507c0c26da57-kube-api-access-s5t8k\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.211131 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.211037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda88ed8-ba87-422a-8cef-507c0c26da57-console-oauth-config\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " 
pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.211131 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.211069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-trusted-ca-bundle\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.211131 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.211109 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda88ed8-ba87-422a-8cef-507c0c26da57-console-serving-cert\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.211131 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.211125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-console-config\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.311978 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.311917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda88ed8-ba87-422a-8cef-507c0c26da57-console-serving-cert\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.311978 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.311944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-console-config\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.311978 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.311971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-oauth-serving-cert\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.312219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.312026 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-service-ca\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.312219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.312064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5t8k\" (UniqueName: \"kubernetes.io/projected/bda88ed8-ba87-422a-8cef-507c0c26da57-kube-api-access-s5t8k\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.312219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.312089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda88ed8-ba87-422a-8cef-507c0c26da57-console-oauth-config\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.312369 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.312236 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-trusted-ca-bundle\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.312765 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.312740 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-oauth-serving-cert\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.312873 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.312833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-service-ca\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.312971 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.312948 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-console-config\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.313444 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.313425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bda88ed8-ba87-422a-8cef-507c0c26da57-trusted-ca-bundle\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.314550 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.314530 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bda88ed8-ba87-422a-8cef-507c0c26da57-console-oauth-config\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.314666 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.314650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bda88ed8-ba87-422a-8cef-507c0c26da57-console-serving-cert\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.321471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.321452 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5t8k\" (UniqueName: \"kubernetes.io/projected/bda88ed8-ba87-422a-8cef-507c0c26da57-kube-api-access-s5t8k\") pod \"console-64496c466b-btqrw\" (UID: \"bda88ed8-ba87-422a-8cef-507c0c26da57\") " pod="openshift-console/console-64496c466b-btqrw" Apr 24 22:37:47.333151 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.333123 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64496c466b-btqrw"
Apr 24 22:37:47.450389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:47.450358 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64496c466b-btqrw"]
Apr 24 22:37:47.453504 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:37:47.453475 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda88ed8_ba87_422a_8cef_507c0c26da57.slice/crio-ae917c82a1b30e85fcf8eda8451afe94f3fd7bcf7308867d38ebcf034d631abb WatchSource:0}: Error finding container ae917c82a1b30e85fcf8eda8451afe94f3fd7bcf7308867d38ebcf034d631abb: Status 404 returned error can't find the container with id ae917c82a1b30e85fcf8eda8451afe94f3fd7bcf7308867d38ebcf034d631abb
Apr 24 22:37:48.035774 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:48.035728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64496c466b-btqrw" event={"ID":"bda88ed8-ba87-422a-8cef-507c0c26da57","Type":"ContainerStarted","Data":"f858bf281efc58dc51118cdb0dad99ad7418ae9d36d755fa304104f4a4580f8e"}
Apr 24 22:37:48.035774 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:48.035777 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64496c466b-btqrw" event={"ID":"bda88ed8-ba87-422a-8cef-507c0c26da57","Type":"ContainerStarted","Data":"ae917c82a1b30e85fcf8eda8451afe94f3fd7bcf7308867d38ebcf034d631abb"}
Apr 24 22:37:48.057139 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:48.057085 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64496c466b-btqrw" podStartSLOduration=2.057068733 podStartE2EDuration="2.057068733s" podCreationTimestamp="2026-04-24 22:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:37:48.054001094 +0000 UTC m=+475.202306642" watchObservedRunningTime="2026-04-24 22:37:48.057068733 +0000 UTC m=+475.205374282"
Apr 24 22:37:53.014526 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:53.014483 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 22:37:53.015052 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:53.014989 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:37:57.333777 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:57.333744 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64496c466b-btqrw"
Apr 24 22:37:57.333777 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:57.333780 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64496c466b-btqrw"
Apr 24 22:37:57.339608 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:57.339582 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64496c466b-btqrw"
Apr 24 22:37:58.066420 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:58.066395 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64496c466b-btqrw"
Apr 24 22:37:58.137129 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:37:58.137097 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55bbbfbc94-mrrkz"]
Apr 24 22:38:03.014832 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:03.014786 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 22:38:03.015260 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:03.015237 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:38:13.014690 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:13.014633 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 22:38:13.015151 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:13.015064 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:38:23.014636 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.014586 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 22:38:23.015153 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.015117 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:38:23.156839 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.156804 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55bbbfbc94-mrrkz" podUID="9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" containerName="console" containerID="cri-o://43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71" gracePeriod=15
Apr 24 22:38:23.387985 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.387962 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55bbbfbc94-mrrkz_9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b/console/0.log"
Apr 24 22:38:23.388107 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.388038 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55bbbfbc94-mrrkz"
Apr 24 22:38:23.473158 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473132 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-trusted-ca-bundle\") pod \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") "
Apr 24 22:38:23.473279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473170 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-config\") pod \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") "
Apr 24 22:38:23.473279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473188 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt5xb\" (UniqueName: \"kubernetes.io/projected/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-kube-api-access-qt5xb\") pod \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") "
Apr 24 22:38:23.473279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473204 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-service-ca\") pod \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") "
Apr 24 22:38:23.473279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473240 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-serving-cert\") pod \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") "
Apr 24 22:38:23.473279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473269 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-oauth-serving-cert\") pod \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") "
Apr 24 22:38:23.473528 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473309 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-oauth-config\") pod \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\" (UID: \"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b\") "
Apr 24 22:38:23.473631 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473602 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" (UID: "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:38:23.473694 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473530 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-config" (OuterVolumeSpecName: "console-config") pod "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" (UID: "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:38:23.473694 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473662 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" (UID: "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:38:23.473779 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.473682 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-service-ca" (OuterVolumeSpecName: "service-ca") pod "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" (UID: "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:38:23.475306 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.475283 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-kube-api-access-qt5xb" (OuterVolumeSpecName: "kube-api-access-qt5xb") pod "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" (UID: "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b"). InnerVolumeSpecName "kube-api-access-qt5xb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:38:23.475431 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.475413 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" (UID: "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:38:23.475508 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.475490 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" (UID: "9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:38:23.574513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.574462 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:38:23.574513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.574482 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-oauth-serving-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:38:23.574513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.574492 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-oauth-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:38:23.574513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.574501 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-trusted-ca-bundle\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:38:23.574513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.574511 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-console-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:38:23.574763 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.574519 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qt5xb\" (UniqueName: \"kubernetes.io/projected/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-kube-api-access-qt5xb\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:38:23.574763 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:23.574528 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b-service-ca\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:38:24.142370 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.142348 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55bbbfbc94-mrrkz_9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b/console/0.log"
Apr 24 22:38:24.142804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.142388 2572 generic.go:358] "Generic (PLEG): container finished" podID="9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" containerID="43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71" exitCode=2
Apr 24 22:38:24.142804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.142420 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bbbfbc94-mrrkz" event={"ID":"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b","Type":"ContainerDied","Data":"43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71"}
Apr 24 22:38:24.142804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.142448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bbbfbc94-mrrkz" event={"ID":"9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b","Type":"ContainerDied","Data":"68c9648b502deb556ffd53c8b95564ae98ce1b0396599cb61a39d3f977870bf2"}
Apr 24 22:38:24.142804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.142463 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55bbbfbc94-mrrkz"
Apr 24 22:38:24.142804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.142466 2572 scope.go:117] "RemoveContainer" containerID="43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71"
Apr 24 22:38:24.151362 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.151344 2572 scope.go:117] "RemoveContainer" containerID="43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71"
Apr 24 22:38:24.151615 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:38:24.151597 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71\": container with ID starting with 43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71 not found: ID does not exist" containerID="43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71"
Apr 24 22:38:24.151662 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.151623 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71"} err="failed to get container status \"43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71\": rpc error: code = NotFound desc = could not find container \"43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71\": container with ID starting with 43998b900a767cee3d9481c7207cef4df6924b11612bfbf8ae4eddd288138b71 not found: ID does not exist"
Apr 24 22:38:24.164388 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.164353 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55bbbfbc94-mrrkz"]
Apr 24 22:38:24.165712 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:24.165692 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55bbbfbc94-mrrkz"]
Apr 24 22:38:25.417960 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:25.417927 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" path="/var/lib/kubelet/pods/9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b/volumes"
Apr 24 22:38:33.014521 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:33.014470 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 22:38:33.014922 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:33.014892 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:38:43.015285 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:43.015184 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt"
Apr 24 22:38:43.015803 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:43.015778 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt"
Apr 24 22:38:48.912746 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:48.912706 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt"]
Apr 24 22:38:48.913261 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:48.913216 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" containerID="cri-o://f52f1857c57c857be065804869ddbd222e1f2742d14e22a785eb5b177b126db0" gracePeriod=30
Apr 24 22:38:48.913406 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:48.913227 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" containerID="cri-o://1bc6b0bcf38fee28c2674025520be03e775fc64480065b2bd7cb0691fb74db1f" gracePeriod=30
Apr 24 22:38:48.913406 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:48.913240 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy" containerID="cri-o://51d7b30b73122c94f7f99843891b31cbb37254a2d04aaa01c11b3909e65460c0" gracePeriod=30
Apr 24 22:38:49.016165 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.016139 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"]
Apr 24 22:38:49.016614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.016586 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" containerName="console"
Apr 24 22:38:49.016614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.016608 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" containerName="console"
Apr 24 22:38:49.016734 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.016659 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d05ee12-3dd2-4abf-8b9e-4a2f0eceb34b" containerName="console"
Apr 24 22:38:49.020022 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.019991 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.023875 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.023854 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-d4694-predictor-serving-cert\""
Apr 24 22:38:49.024240 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.024220 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\""
Apr 24 22:38:49.035386 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.035364 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"]
Apr 24 22:38:49.106359 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.106333 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"]
Apr 24 22:38:49.109863 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.109845 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.112458 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.112436 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\""
Apr 24 22:38:49.112619 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.112599 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-d4694-predictor-serving-cert\""
Apr 24 22:38:49.117794 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.117773 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"]
Apr 24 22:38:49.170377 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.170311 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d130507f-f6d9-401d-a5e3-1328301efd06-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.170489 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.170389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130507f-f6d9-401d-a5e3-1328301efd06-isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.170489 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.170421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vpt\" (UniqueName: \"kubernetes.io/projected/d130507f-f6d9-401d-a5e3-1328301efd06-kube-api-access-k4vpt\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.170489 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.170457 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130507f-f6d9-401d-a5e3-1328301efd06-proxy-tls\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.224079 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.224054 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerID="51d7b30b73122c94f7f99843891b31cbb37254a2d04aaa01c11b3909e65460c0" exitCode=2
Apr 24 22:38:49.224181 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.224126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerDied","Data":"51d7b30b73122c94f7f99843891b31cbb37254a2d04aaa01c11b3909e65460c0"}
Apr 24 22:38:49.271161 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271137 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bca9a52c-f7a2-4936-a746-fd24af9e0506-isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.271259 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130507f-f6d9-401d-a5e3-1328301efd06-isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.271259 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vpt\" (UniqueName: \"kubernetes.io/projected/d130507f-f6d9-401d-a5e3-1328301efd06-kube-api-access-k4vpt\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.271259 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130507f-f6d9-401d-a5e3-1328301efd06-proxy-tls\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.271259 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4xs\" (UniqueName: \"kubernetes.io/projected/bca9a52c-f7a2-4936-a746-fd24af9e0506-kube-api-access-rg4xs\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.271467 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271283 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bca9a52c-f7a2-4936-a746-fd24af9e0506-proxy-tls\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.271467 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d130507f-f6d9-401d-a5e3-1328301efd06-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.271467 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bca9a52c-f7a2-4936-a746-fd24af9e0506-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.271653 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d130507f-f6d9-401d-a5e3-1328301efd06-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.271815 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.271799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130507f-f6d9-401d-a5e3-1328301efd06-isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.273536 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.273521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130507f-f6d9-401d-a5e3-1328301efd06-proxy-tls\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.279179 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.279156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vpt\" (UniqueName: \"kubernetes.io/projected/d130507f-f6d9-401d-a5e3-1328301efd06-kube-api-access-k4vpt\") pod \"isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.331511 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.331490 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:38:49.372332 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.372302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bca9a52c-f7a2-4936-a746-fd24af9e0506-proxy-tls\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.372463 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.372364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bca9a52c-f7a2-4936-a746-fd24af9e0506-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.372463 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.372420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bca9a52c-f7a2-4936-a746-fd24af9e0506-isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.372646 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.372481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4xs\" (UniqueName: \"kubernetes.io/projected/bca9a52c-f7a2-4936-a746-fd24af9e0506-kube-api-access-rg4xs\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.372858 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.372836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bca9a52c-f7a2-4936-a746-fd24af9e0506-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.373106 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.373088 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bca9a52c-f7a2-4936-a746-fd24af9e0506-isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.374695 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.374673 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bca9a52c-f7a2-4936-a746-fd24af9e0506-proxy-tls\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.381636 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.381612 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4xs\" (UniqueName: \"kubernetes.io/projected/bca9a52c-f7a2-4936-a746-fd24af9e0506-kube-api-access-rg4xs\") pod \"isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.421288 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.421225 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:38:49.452806 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.452781 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"]
Apr 24 22:38:49.455233 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:38:49.455208 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd130507f_f6d9_401d_a5e3_1328301efd06.slice/crio-b0919fce0541381c7b7785d07063d28ada86c17a22540c2cdf23d2499378ec64 WatchSource:0}: Error finding container b0919fce0541381c7b7785d07063d28ada86c17a22540c2cdf23d2499378ec64: Status 404 returned error can't find the container with id b0919fce0541381c7b7785d07063d28ada86c17a22540c2cdf23d2499378ec64
Apr 24 22:38:49.549268 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:49.549241 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"]
Apr 24 22:38:49.550319 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:38:49.550291 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbca9a52c_f7a2_4936_a746_fd24af9e0506.slice/crio-55155f1a8f514819bbbe6048f8c2755b13a29ac6eb163476f00c66cc8c80ec9d WatchSource:0}: Error finding container 55155f1a8f514819bbbe6048f8c2755b13a29ac6eb163476f00c66cc8c80ec9d: Status 404 returned error can't find the container with id 55155f1a8f514819bbbe6048f8c2755b13a29ac6eb163476f00c66cc8c80ec9d
Apr 24 22:38:50.228752 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:50.228719 2572 kubelet.go:2569] "SyncLoop (PLEG): event
for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" event={"ID":"bca9a52c-f7a2-4936-a746-fd24af9e0506","Type":"ContainerStarted","Data":"ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8"} Apr 24 22:38:50.228752 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:50.228752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" event={"ID":"bca9a52c-f7a2-4936-a746-fd24af9e0506","Type":"ContainerStarted","Data":"55155f1a8f514819bbbe6048f8c2755b13a29ac6eb163476f00c66cc8c80ec9d"} Apr 24 22:38:50.230067 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:50.230039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" event={"ID":"d130507f-f6d9-401d-a5e3-1328301efd06","Type":"ContainerStarted","Data":"a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817"} Apr 24 22:38:50.230186 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:50.230073 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" event={"ID":"d130507f-f6d9-401d-a5e3-1328301efd06","Type":"ContainerStarted","Data":"b0919fce0541381c7b7785d07063d28ada86c17a22540c2cdf23d2499378ec64"} Apr 24 22:38:52.012122 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:52.012081 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 22:38:53.014871 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:53.014831 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" 
podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 22:38:53.016538 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:53.016504 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:38:53.243242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:53.243209 2572 generic.go:358] "Generic (PLEG): container finished" podID="d130507f-f6d9-401d-a5e3-1328301efd06" containerID="a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817" exitCode=0 Apr 24 22:38:53.243343 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:53.243280 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" event={"ID":"d130507f-f6d9-401d-a5e3-1328301efd06","Type":"ContainerDied","Data":"a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817"} Apr 24 22:38:53.245518 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:53.245499 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerID="f52f1857c57c857be065804869ddbd222e1f2742d14e22a785eb5b177b126db0" exitCode=0 Apr 24 22:38:53.245619 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:53.245566 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerDied","Data":"f52f1857c57c857be065804869ddbd222e1f2742d14e22a785eb5b177b126db0"} Apr 24 22:38:54.249558 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:54.249523 2572 generic.go:358] "Generic (PLEG): container finished" podID="bca9a52c-f7a2-4936-a746-fd24af9e0506" 
containerID="ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8" exitCode=0 Apr 24 22:38:54.250048 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:54.249607 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" event={"ID":"bca9a52c-f7a2-4936-a746-fd24af9e0506","Type":"ContainerDied","Data":"ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8"} Apr 24 22:38:54.251542 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:54.251502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" event={"ID":"d130507f-f6d9-401d-a5e3-1328301efd06","Type":"ContainerStarted","Data":"65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0"} Apr 24 22:38:54.251542 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:54.251530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" event={"ID":"d130507f-f6d9-401d-a5e3-1328301efd06","Type":"ContainerStarted","Data":"d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5"} Apr 24 22:38:54.251834 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:54.251813 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" Apr 24 22:38:54.251936 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:54.251853 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" Apr 24 22:38:54.252873 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:54.252849 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 22:38:54.290943 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:54.290899 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podStartSLOduration=6.290884526 podStartE2EDuration="6.290884526s" podCreationTimestamp="2026-04-24 22:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:38:54.28915591 +0000 UTC m=+541.437461457" watchObservedRunningTime="2026-04-24 22:38:54.290884526 +0000 UTC m=+541.439190074" Apr 24 22:38:55.255955 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:55.255917 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 22:38:57.012617 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:38:57.012565 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 22:39:00.261210 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:00.261182 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" Apr 24 22:39:00.261768 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:00.261740 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" 
podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 22:39:02.012742 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:02.012700 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 22:39:02.013188 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:02.012851 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:39:03.014897 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:03.014851 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 22:39:03.016482 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:03.016450 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:39:07.012731 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:07.012690 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: 
connection refused" Apr 24 22:39:10.262155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:10.262110 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 22:39:12.012931 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:12.012883 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 22:39:13.014912 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:13.014881 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 22:39:13.015228 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:13.015031 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:39:13.016215 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:13.016192 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:39:13.016319 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:13.016309 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:39:13.315002 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:13.314975 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" event={"ID":"bca9a52c-f7a2-4936-a746-fd24af9e0506","Type":"ContainerStarted","Data":"c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573"} Apr 24 22:39:13.315156 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:13.315029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" event={"ID":"bca9a52c-f7a2-4936-a746-fd24af9e0506","Type":"ContainerStarted","Data":"27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89"} Apr 24 22:39:13.315378 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:13.315358 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" Apr 24 22:39:13.335472 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:13.335430 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podStartSLOduration=5.653359381 podStartE2EDuration="24.335416633s" podCreationTimestamp="2026-04-24 22:38:49 +0000 UTC" firstStartedPulling="2026-04-24 22:38:54.251099627 +0000 UTC m=+541.399405155" lastFinishedPulling="2026-04-24 22:39:12.933156871 +0000 UTC m=+560.081462407" observedRunningTime="2026-04-24 22:39:13.334320249 +0000 UTC m=+560.482625815" watchObservedRunningTime="2026-04-24 22:39:13.335416633 +0000 UTC m=+560.483722182" Apr 24 22:39:14.318698 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:14.318665 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" Apr 24 
22:39:14.319799 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:14.319769 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 22:39:15.321068 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:15.320999 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 22:39:17.012719 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:17.012678 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 22:39:19.336708 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.336680 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerID="1bc6b0bcf38fee28c2674025520be03e775fc64480065b2bd7cb0691fb74db1f" exitCode=0 Apr 24 22:39:19.337133 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.336741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerDied","Data":"1bc6b0bcf38fee28c2674025520be03e775fc64480065b2bd7cb0691fb74db1f"} Apr 24 22:39:19.574474 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.574446 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:39:19.626791 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.626769 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\") pod \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " Apr 24 22:39:19.626894 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.626808 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frz59\" (UniqueName: \"kubernetes.io/projected/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kube-api-access-frz59\") pod \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " Apr 24 22:39:19.626894 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.626835 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-proxy-tls\") pod \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " Apr 24 22:39:19.626894 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.626881 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kserve-provision-location\") pod \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\" (UID: \"5cf881dd-dc1a-4ac1-8717-4cff33fc74be\") " Apr 24 22:39:19.627243 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.627212 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config") pod "5cf881dd-dc1a-4ac1-8717-4cff33fc74be" (UID: "5cf881dd-dc1a-4ac1-8717-4cff33fc74be"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:39:19.627339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.627236 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5cf881dd-dc1a-4ac1-8717-4cff33fc74be" (UID: "5cf881dd-dc1a-4ac1-8717-4cff33fc74be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:39:19.628943 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.628922 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kube-api-access-frz59" (OuterVolumeSpecName: "kube-api-access-frz59") pod "5cf881dd-dc1a-4ac1-8717-4cff33fc74be" (UID: "5cf881dd-dc1a-4ac1-8717-4cff33fc74be"). InnerVolumeSpecName "kube-api-access-frz59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:39:19.629088 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.628954 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5cf881dd-dc1a-4ac1-8717-4cff33fc74be" (UID: "5cf881dd-dc1a-4ac1-8717-4cff33fc74be"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:39:19.728512 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.728490 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:39:19.728512 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.728512 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-isvc-raw-sklearn-batcher-dd47d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:39:19.728644 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.728522 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frz59\" (UniqueName: \"kubernetes.io/projected/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-kube-api-access-frz59\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:39:19.728644 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:19.728533 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cf881dd-dc1a-4ac1-8717-4cff33fc74be-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:39:20.262243 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.262206 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 22:39:20.324571 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.324548 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" Apr 24 22:39:20.325158 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.325128 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 22:39:20.342625 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.342596 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" event={"ID":"5cf881dd-dc1a-4ac1-8717-4cff33fc74be","Type":"ContainerDied","Data":"91fbd3da666542b7ab6fe432892577f9af5dcef6c2c6b331d53cc1486b1d7854"} Apr 24 22:39:20.343006 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.342643 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt" Apr 24 22:39:20.343006 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.342651 2572 scope.go:117] "RemoveContainer" containerID="1bc6b0bcf38fee28c2674025520be03e775fc64480065b2bd7cb0691fb74db1f" Apr 24 22:39:20.351102 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.351083 2572 scope.go:117] "RemoveContainer" containerID="51d7b30b73122c94f7f99843891b31cbb37254a2d04aaa01c11b3909e65460c0" Apr 24 22:39:20.359718 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.359692 2572 scope.go:117] "RemoveContainer" containerID="f52f1857c57c857be065804869ddbd222e1f2742d14e22a785eb5b177b126db0" Apr 24 22:39:20.367130 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.367104 2572 scope.go:117] "RemoveContainer" containerID="fd397ea13b6d79d0ab05331542134b0fe30018ae66b7e01c268a465cceaa6744" Apr 24 22:39:20.370916 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.370895 2572 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt"] Apr 24 22:39:20.374938 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:20.374921 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dd47d-predictor-7bdbd56696-xnfjt"] Apr 24 22:39:21.417288 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:21.417256 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" path="/var/lib/kubelet/pods/5cf881dd-dc1a-4ac1-8717-4cff33fc74be/volumes" Apr 24 22:39:30.261777 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:30.261741 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 22:39:30.325449 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:30.325418 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 22:39:40.262608 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:40.262570 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 22:39:40.325607 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:40.325569 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" 
podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 22:39:50.261973 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:50.261927 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 22:39:50.325040 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:39:50.324996 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 22:40:00.263212 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:00.263183 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" Apr 24 22:40:00.325854 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:00.325817 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 22:40:10.325626 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:10.325553 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" Apr 24 22:40:29.237180 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.237148 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"]
Apr 24 22:40:29.237571 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.237456 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" containerID="cri-o://d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5" gracePeriod=30
Apr 24 22:40:29.237571 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.237516 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kube-rbac-proxy" containerID="cri-o://65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0" gracePeriod=30
Apr 24 22:40:29.293339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293313 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"]
Apr 24 22:40:29.293735 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293719 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy"
Apr 24 22:40:29.293823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293737 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy"
Apr 24 22:40:29.293823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293766 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container"
Apr 24 22:40:29.293823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293775 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container"
Apr 24 22:40:29.293823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293788 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="storage-initializer"
Apr 24 22:40:29.293823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293796 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="storage-initializer"
Apr 24 22:40:29.293823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293813 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent"
Apr 24 22:40:29.293823 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293821 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent"
Apr 24 22:40:29.294179 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293901 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kube-rbac-proxy"
Apr 24 22:40:29.294179 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293921 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="agent"
Apr 24 22:40:29.294179 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.293933 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cf881dd-dc1a-4ac1-8717-4cff33fc74be" containerName="kserve-container"
Apr 24 22:40:29.296918 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.296900 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.299750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.299720 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-133b5-predictor-serving-cert\""
Apr 24 22:40:29.299846 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.299722 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\""
Apr 24 22:40:29.308955 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.308935 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"]
Apr 24 22:40:29.347741 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.347717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92a467b0-f71f-4b28-b810-ef6cfad58340-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.347853 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.347759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92a467b0-f71f-4b28-b810-ef6cfad58340-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.347899 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.347848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92a467b0-f71f-4b28-b810-ef6cfad58340-isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.347990 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.347970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv4wk\" (UniqueName: \"kubernetes.io/projected/92a467b0-f71f-4b28-b810-ef6cfad58340-kube-api-access-cv4wk\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.404809 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.404783 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"]
Apr 24 22:40:29.405111 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.405089 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" containerID="cri-o://27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89" gracePeriod=30
Apr 24 22:40:29.405197 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.405131 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kube-rbac-proxy" containerID="cri-o://c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573" gracePeriod=30
Apr 24 22:40:29.446916 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.446891 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"]
Apr 24 22:40:29.448360 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.448336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92a467b0-f71f-4b28-b810-ef6cfad58340-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.448424 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.448383 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92a467b0-f71f-4b28-b810-ef6cfad58340-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.448466 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.448433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92a467b0-f71f-4b28-b810-ef6cfad58340-isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.448533 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.448516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv4wk\" (UniqueName: \"kubernetes.io/projected/92a467b0-f71f-4b28-b810-ef6cfad58340-kube-api-access-cv4wk\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.448778 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.448746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92a467b0-f71f-4b28-b810-ef6cfad58340-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.449085 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.449066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92a467b0-f71f-4b28-b810-ef6cfad58340-isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.450652 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.450635 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92a467b0-f71f-4b28-b810-ef6cfad58340-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.450736 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.450719 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.453142 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.453122 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-133b5-predictor-serving-cert\""
Apr 24 22:40:29.453234 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.453125 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\""
Apr 24 22:40:29.456034 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.455998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv4wk\" (UniqueName: \"kubernetes.io/projected/92a467b0-f71f-4b28-b810-ef6cfad58340-kube-api-access-cv4wk\") pod \"isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.460984 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.460967 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"]
Apr 24 22:40:29.548987 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.548962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cae113f-ed3f-4799-98a3-0804da8ad90f-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.549137 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.549040 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cae113f-ed3f-4799-98a3-0804da8ad90f-isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.549137 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.549073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98wp7\" (UniqueName: \"kubernetes.io/projected/5cae113f-ed3f-4799-98a3-0804da8ad90f-kube-api-access-98wp7\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.549137 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.549115 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cae113f-ed3f-4799-98a3-0804da8ad90f-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.551193 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.551169 2572 generic.go:358] "Generic (PLEG): container finished" podID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerID="c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573" exitCode=2
Apr 24 22:40:29.551291 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.551246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" event={"ID":"bca9a52c-f7a2-4936-a746-fd24af9e0506","Type":"ContainerDied","Data":"c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573"}
Apr 24 22:40:29.552817 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.552799 2572 generic.go:358] "Generic (PLEG): container finished" podID="d130507f-f6d9-401d-a5e3-1328301efd06" containerID="65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0" exitCode=2
Apr 24 22:40:29.552897 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.552829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" event={"ID":"d130507f-f6d9-401d-a5e3-1328301efd06","Type":"ContainerDied","Data":"65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0"}
Apr 24 22:40:29.609003 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.608981 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"
Apr 24 22:40:29.650079 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.650035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cae113f-ed3f-4799-98a3-0804da8ad90f-isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.650230 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.650097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98wp7\" (UniqueName: \"kubernetes.io/projected/5cae113f-ed3f-4799-98a3-0804da8ad90f-kube-api-access-98wp7\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.650230 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.650161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cae113f-ed3f-4799-98a3-0804da8ad90f-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.650230 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.650205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cae113f-ed3f-4799-98a3-0804da8ad90f-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.650496 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.650470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cae113f-ed3f-4799-98a3-0804da8ad90f-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.650719 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.650698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cae113f-ed3f-4799-98a3-0804da8ad90f-isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.653994 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.653548 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cae113f-ed3f-4799-98a3-0804da8ad90f-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.660154 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.660129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98wp7\" (UniqueName: \"kubernetes.io/projected/5cae113f-ed3f-4799-98a3-0804da8ad90f-kube-api-access-98wp7\") pod \"isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.724239 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.722136 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"]
Apr 24 22:40:29.725560 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:40:29.725530 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92a467b0_f71f_4b28_b810_ef6cfad58340.slice/crio-be2ce0bf82a7492c8831b1e0eac288089b5fbbb9cf7b5a467770d2bdd055f536 WatchSource:0}: Error finding container be2ce0bf82a7492c8831b1e0eac288089b5fbbb9cf7b5a467770d2bdd055f536: Status 404 returned error can't find the container with id be2ce0bf82a7492c8831b1e0eac288089b5fbbb9cf7b5a467770d2bdd055f536
Apr 24 22:40:29.773096 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.773074 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"
Apr 24 22:40:29.892260 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:29.892237 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"]
Apr 24 22:40:29.894674 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:40:29.894645 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cae113f_ed3f_4799_98a3_0804da8ad90f.slice/crio-771f27871ae0aac2ae75e7a52f82eab3a36d1f0a744dc3aa6ff8345e18b81dda WatchSource:0}: Error finding container 771f27871ae0aac2ae75e7a52f82eab3a36d1f0a744dc3aa6ff8345e18b81dda: Status 404 returned error can't find the container with id 771f27871ae0aac2ae75e7a52f82eab3a36d1f0a744dc3aa6ff8345e18b81dda
Apr 24 22:40:30.256239 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:30.256194 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 24 22:40:30.262002 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:30.261967 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 22:40:30.321638 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:30.321600 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.27:8643/healthz\": dial tcp 10.134.0.27:8643: connect: connection refused"
Apr 24 22:40:30.325498 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:30.325462 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 24 22:40:30.558540 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:30.558444 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" event={"ID":"92a467b0-f71f-4b28-b810-ef6cfad58340","Type":"ContainerStarted","Data":"60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556"}
Apr 24 22:40:30.558540 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:30.558494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" event={"ID":"92a467b0-f71f-4b28-b810-ef6cfad58340","Type":"ContainerStarted","Data":"be2ce0bf82a7492c8831b1e0eac288089b5fbbb9cf7b5a467770d2bdd055f536"}
Apr 24 22:40:30.559717 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:30.559694 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" event={"ID":"5cae113f-ed3f-4799-98a3-0804da8ad90f","Type":"ContainerStarted","Data":"b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160"}
Apr 24 22:40:30.559717 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:30.559721 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" event={"ID":"5cae113f-ed3f-4799-98a3-0804da8ad90f","Type":"ContainerStarted","Data":"771f27871ae0aac2ae75e7a52f82eab3a36d1f0a744dc3aa6ff8345e18b81dda"}
Apr 24 22:40:32.735657 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.735634 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:40:32.775913 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.775892 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bca9a52c-f7a2-4936-a746-fd24af9e0506-kserve-provision-location\") pod \"bca9a52c-f7a2-4936-a746-fd24af9e0506\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") "
Apr 24 22:40:32.776040 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.775973 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4xs\" (UniqueName: \"kubernetes.io/projected/bca9a52c-f7a2-4936-a746-fd24af9e0506-kube-api-access-rg4xs\") pod \"bca9a52c-f7a2-4936-a746-fd24af9e0506\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") "
Apr 24 22:40:32.776040 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.775997 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bca9a52c-f7a2-4936-a746-fd24af9e0506-isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\") pod \"bca9a52c-f7a2-4936-a746-fd24af9e0506\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") "
Apr 24 22:40:32.776158 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.776104 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bca9a52c-f7a2-4936-a746-fd24af9e0506-proxy-tls\") pod \"bca9a52c-f7a2-4936-a746-fd24af9e0506\" (UID: \"bca9a52c-f7a2-4936-a746-fd24af9e0506\") "
Apr 24 22:40:32.776220 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.776198 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca9a52c-f7a2-4936-a746-fd24af9e0506-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bca9a52c-f7a2-4936-a746-fd24af9e0506" (UID: "bca9a52c-f7a2-4936-a746-fd24af9e0506"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:40:32.776283 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.776264 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca9a52c-f7a2-4936-a746-fd24af9e0506-isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config") pod "bca9a52c-f7a2-4936-a746-fd24af9e0506" (UID: "bca9a52c-f7a2-4936-a746-fd24af9e0506"). InnerVolumeSpecName "isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:40:32.776423 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.776408 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bca9a52c-f7a2-4936-a746-fd24af9e0506-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:40:32.776475 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.776431 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bca9a52c-f7a2-4936-a746-fd24af9e0506-isvc-xgboost-graph-raw-d4694-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:40:32.777889 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.777872 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca9a52c-f7a2-4936-a746-fd24af9e0506-kube-api-access-rg4xs" (OuterVolumeSpecName: "kube-api-access-rg4xs") pod "bca9a52c-f7a2-4936-a746-fd24af9e0506" (UID: "bca9a52c-f7a2-4936-a746-fd24af9e0506"). InnerVolumeSpecName "kube-api-access-rg4xs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:40:32.777976 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.777961 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca9a52c-f7a2-4936-a746-fd24af9e0506-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bca9a52c-f7a2-4936-a746-fd24af9e0506" (UID: "bca9a52c-f7a2-4936-a746-fd24af9e0506"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:40:32.877570 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.877516 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rg4xs\" (UniqueName: \"kubernetes.io/projected/bca9a52c-f7a2-4936-a746-fd24af9e0506-kube-api-access-rg4xs\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:40:32.877570 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:32.877540 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bca9a52c-f7a2-4936-a746-fd24af9e0506-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:40:33.067693 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.067674 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:40:33.179849 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.179789 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4vpt\" (UniqueName: \"kubernetes.io/projected/d130507f-f6d9-401d-a5e3-1328301efd06-kube-api-access-k4vpt\") pod \"d130507f-f6d9-401d-a5e3-1328301efd06\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") "
Apr 24 22:40:33.179967 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.179881 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d130507f-f6d9-401d-a5e3-1328301efd06-kserve-provision-location\") pod \"d130507f-f6d9-401d-a5e3-1328301efd06\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") "
Apr 24 22:40:33.179967 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.179915 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130507f-f6d9-401d-a5e3-1328301efd06-isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\") pod \"d130507f-f6d9-401d-a5e3-1328301efd06\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") "
Apr 24 22:40:33.180099 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.179971 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130507f-f6d9-401d-a5e3-1328301efd06-proxy-tls\") pod \"d130507f-f6d9-401d-a5e3-1328301efd06\" (UID: \"d130507f-f6d9-401d-a5e3-1328301efd06\") "
Apr 24 22:40:33.180235 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.180207 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d130507f-f6d9-401d-a5e3-1328301efd06-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d130507f-f6d9-401d-a5e3-1328301efd06" (UID: "d130507f-f6d9-401d-a5e3-1328301efd06"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:40:33.180311 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.180284 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d130507f-f6d9-401d-a5e3-1328301efd06-isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config") pod "d130507f-f6d9-401d-a5e3-1328301efd06" (UID: "d130507f-f6d9-401d-a5e3-1328301efd06"). InnerVolumeSpecName "isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:40:33.181724 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.181702 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d130507f-f6d9-401d-a5e3-1328301efd06-kube-api-access-k4vpt" (OuterVolumeSpecName: "kube-api-access-k4vpt") pod "d130507f-f6d9-401d-a5e3-1328301efd06" (UID: "d130507f-f6d9-401d-a5e3-1328301efd06"). InnerVolumeSpecName "kube-api-access-k4vpt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:40:33.181875 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.181857 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d130507f-f6d9-401d-a5e3-1328301efd06-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d130507f-f6d9-401d-a5e3-1328301efd06" (UID: "d130507f-f6d9-401d-a5e3-1328301efd06"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:40:33.280844 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.280823 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4vpt\" (UniqueName: \"kubernetes.io/projected/d130507f-f6d9-401d-a5e3-1328301efd06-kube-api-access-k4vpt\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:40:33.280844 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.280843 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d130507f-f6d9-401d-a5e3-1328301efd06-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:40:33.280965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.280854 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130507f-f6d9-401d-a5e3-1328301efd06-isvc-sklearn-graph-raw-d4694-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:40:33.280965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.280864 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130507f-f6d9-401d-a5e3-1328301efd06-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:40:33.570890 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.570862 2572 generic.go:358] "Generic (PLEG): container finished" podID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerID="27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89" exitCode=0
Apr 24 22:40:33.571005 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.570932 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"
Apr 24 22:40:33.571005 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.570953 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" event={"ID":"bca9a52c-f7a2-4936-a746-fd24af9e0506","Type":"ContainerDied","Data":"27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89"}
Apr 24 22:40:33.571005 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.570997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z" event={"ID":"bca9a52c-f7a2-4936-a746-fd24af9e0506","Type":"ContainerDied","Data":"55155f1a8f514819bbbe6048f8c2755b13a29ac6eb163476f00c66cc8c80ec9d"}
Apr 24 22:40:33.571185 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.571030 2572 scope.go:117] "RemoveContainer" containerID="c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573"
Apr 24 22:40:33.572587 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.572563 2572 generic.go:358] "Generic (PLEG): container finished" podID="d130507f-f6d9-401d-a5e3-1328301efd06" containerID="d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5" exitCode=0
Apr 24 22:40:33.572681 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.572652 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"
Apr 24 22:40:33.572751 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.572646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" event={"ID":"d130507f-f6d9-401d-a5e3-1328301efd06","Type":"ContainerDied","Data":"d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5"}
Apr 24 22:40:33.572751 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.572738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh" event={"ID":"d130507f-f6d9-401d-a5e3-1328301efd06","Type":"ContainerDied","Data":"b0919fce0541381c7b7785d07063d28ada86c17a22540c2cdf23d2499378ec64"}
Apr 24 22:40:33.574267 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.574246 2572 generic.go:358] "Generic (PLEG): container finished" podID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerID="60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556" exitCode=0
Apr 24 22:40:33.574374 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.574281 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" event={"ID":"92a467b0-f71f-4b28-b810-ef6cfad58340","Type":"ContainerDied","Data":"60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556"}
Apr 24 22:40:33.579666 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.579652 2572 scope.go:117] "RemoveContainer" containerID="27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89"
Apr 24 22:40:33.586255 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.586160 2572
scope.go:117] "RemoveContainer" containerID="ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8" Apr 24 22:40:33.589103 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.589079 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"] Apr 24 22:40:33.592390 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.592370 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-d4694-predictor-99989f9d9-jtk6z"] Apr 24 22:40:33.596678 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.596601 2572 scope.go:117] "RemoveContainer" containerID="c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573" Apr 24 22:40:33.597511 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:40:33.597487 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573\": container with ID starting with c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573 not found: ID does not exist" containerID="c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573" Apr 24 22:40:33.597618 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.597520 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573"} err="failed to get container status \"c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573\": rpc error: code = NotFound desc = could not find container \"c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573\": container with ID starting with c3611a5601f633660803495f08fdafbe1b4c55e959209dc7ee58791722c76573 not found: ID does not exist" Apr 24 22:40:33.597618 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.597543 2572 scope.go:117] "RemoveContainer" 
containerID="27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89" Apr 24 22:40:33.597836 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:40:33.597810 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89\": container with ID starting with 27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89 not found: ID does not exist" containerID="27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89" Apr 24 22:40:33.597904 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.597846 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89"} err="failed to get container status \"27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89\": rpc error: code = NotFound desc = could not find container \"27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89\": container with ID starting with 27317ee53e7dcde9bb7486e993361d8ecf35e52490e0a944c365706e48c01b89 not found: ID does not exist" Apr 24 22:40:33.597904 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.597869 2572 scope.go:117] "RemoveContainer" containerID="ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8" Apr 24 22:40:33.598183 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:40:33.598155 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8\": container with ID starting with ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8 not found: ID does not exist" containerID="ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8" Apr 24 22:40:33.598287 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.598186 2572 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8"} err="failed to get container status \"ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8\": rpc error: code = NotFound desc = could not find container \"ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8\": container with ID starting with ff769907b8ad607cef0c3df449b96304ffd6b0d3b57e5bf883ac385c4c7195b8 not found: ID does not exist" Apr 24 22:40:33.598287 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.598208 2572 scope.go:117] "RemoveContainer" containerID="65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0" Apr 24 22:40:33.607268 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.607248 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"] Apr 24 22:40:33.609765 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.609745 2572 scope.go:117] "RemoveContainer" containerID="d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5" Apr 24 22:40:33.611137 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.611120 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-d4694-predictor-776f564644-wm9dh"] Apr 24 22:40:33.624252 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.624053 2572 scope.go:117] "RemoveContainer" containerID="a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817" Apr 24 22:40:33.641085 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.641067 2572 scope.go:117] "RemoveContainer" containerID="65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0" Apr 24 22:40:33.641349 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:40:33.641332 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0\": container with ID 
starting with 65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0 not found: ID does not exist" containerID="65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0" Apr 24 22:40:33.641404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.641356 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0"} err="failed to get container status \"65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0\": rpc error: code = NotFound desc = could not find container \"65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0\": container with ID starting with 65f84c9158f2789717fc0dbab3ddf28e8d36549520c1f3c319131b101d96b1d0 not found: ID does not exist" Apr 24 22:40:33.641404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.641372 2572 scope.go:117] "RemoveContainer" containerID="d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5" Apr 24 22:40:33.641613 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:40:33.641595 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5\": container with ID starting with d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5 not found: ID does not exist" containerID="d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5" Apr 24 22:40:33.641656 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.641620 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5"} err="failed to get container status \"d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5\": rpc error: code = NotFound desc = could not find container \"d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5\": container with ID starting with 
d6b5626cf9fc45f031f68f62b077186cc3b084a28fea6896c25aad37a5f29bf5 not found: ID does not exist" Apr 24 22:40:33.641656 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.641634 2572 scope.go:117] "RemoveContainer" containerID="a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817" Apr 24 22:40:33.641841 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:40:33.641826 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817\": container with ID starting with a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817 not found: ID does not exist" containerID="a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817" Apr 24 22:40:33.641881 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:33.641845 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817"} err="failed to get container status \"a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817\": rpc error: code = NotFound desc = could not find container \"a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817\": container with ID starting with a22d474966db80852e05087584ed17517db7ff7ec7f3e1dbbe51de3194e5d817 not found: ID does not exist" Apr 24 22:40:34.578591 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:34.578560 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerID="b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160" exitCode=0 Apr 24 22:40:34.579006 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:34.578635 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" 
event={"ID":"5cae113f-ed3f-4799-98a3-0804da8ad90f","Type":"ContainerDied","Data":"b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160"} Apr 24 22:40:34.581364 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:34.581328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" event={"ID":"92a467b0-f71f-4b28-b810-ef6cfad58340","Type":"ContainerStarted","Data":"ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d"} Apr 24 22:40:34.581364 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:34.581361 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" event={"ID":"92a467b0-f71f-4b28-b810-ef6cfad58340","Type":"ContainerStarted","Data":"b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef"} Apr 24 22:40:34.581642 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:34.581618 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" Apr 24 22:40:34.581702 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:34.581657 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" Apr 24 22:40:34.582619 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:34.582593 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:40:34.616544 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:34.616477 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podStartSLOduration=5.616459566 podStartE2EDuration="5.616459566s" podCreationTimestamp="2026-04-24 22:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:40:34.61322201 +0000 UTC m=+641.761527599" watchObservedRunningTime="2026-04-24 22:40:34.616459566 +0000 UTC m=+641.764765119" Apr 24 22:40:35.417542 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:35.417506 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" path="/var/lib/kubelet/pods/bca9a52c-f7a2-4936-a746-fd24af9e0506/volumes" Apr 24 22:40:35.417983 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:35.417969 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" path="/var/lib/kubelet/pods/d130507f-f6d9-401d-a5e3-1328301efd06/volumes" Apr 24 22:40:35.587219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:35.587192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" event={"ID":"5cae113f-ed3f-4799-98a3-0804da8ad90f","Type":"ContainerStarted","Data":"0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91"} Apr 24 22:40:35.587219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:35.587223 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" event={"ID":"5cae113f-ed3f-4799-98a3-0804da8ad90f","Type":"ContainerStarted","Data":"37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0"} Apr 24 22:40:35.587682 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:35.587531 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" 
podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:40:35.606536 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:35.606492 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podStartSLOduration=6.606467392 podStartE2EDuration="6.606467392s" podCreationTimestamp="2026-04-24 22:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:40:35.604943123 +0000 UTC m=+642.753248672" watchObservedRunningTime="2026-04-24 22:40:35.606467392 +0000 UTC m=+642.754772942" Apr 24 22:40:40.588389 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:40.588360 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" Apr 24 22:40:40.588888 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:40.588860 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" Apr 24 22:40:40.589727 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:40.589700 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:40:40.593628 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:40.593611 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" Apr 24 22:40:40.594058 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:40.594032 2572 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:40:40.594455 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:40.594439 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" Apr 24 22:40:40.604137 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:40.604114 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:40:41.613564 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:41.613522 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:40:50.594781 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:50.594734 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:40:51.609481 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:40:51.609440 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:41:00.594005 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:00.593971 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:41:01.609276 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:01.609234 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:41:10.594185 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:10.594150 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:41:11.608698 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:11.608652 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:41:20.594186 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:20.594143 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:41:21.609303 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:21.609263 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:41:30.594997 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:30.594958 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:41:31.609187 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:31.609161 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" Apr 24 22:41:40.595252 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:41:40.595143 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" Apr 24 22:42:09.466270 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.466240 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"] Apr 24 22:42:09.466827 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.466600 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" containerID="cri-o://37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0" gracePeriod=30 Apr 24 22:42:09.466827 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.466654 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kube-rbac-proxy" containerID="cri-o://0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91" gracePeriod=30 Apr 24 22:42:09.529790 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.529761 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"] Apr 24 22:42:09.530108 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.530067 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" containerID="cri-o://b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef" gracePeriod=30 Apr 24 22:42:09.530187 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.530145 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kube-rbac-proxy" containerID="cri-o://ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d" gracePeriod=30 Apr 24 22:42:09.555429 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555406 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9"] Apr 24 22:42:09.555777 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555761 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="storage-initializer" Apr 24 22:42:09.555855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555779 
2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="storage-initializer" Apr 24 22:42:09.555855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555791 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="storage-initializer" Apr 24 22:42:09.555855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555798 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="storage-initializer" Apr 24 22:42:09.555855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555808 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kube-rbac-proxy" Apr 24 22:42:09.555855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555817 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kube-rbac-proxy" Apr 24 22:42:09.555855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555839 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kube-rbac-proxy" Apr 24 22:42:09.555855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555847 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kube-rbac-proxy" Apr 24 22:42:09.556209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555865 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" Apr 24 22:42:09.556209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555873 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" Apr 24 22:42:09.556209 ip-10-0-137-103 kubenswrapper[2572]: 
I0424 22:42:09.555888 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" Apr 24 22:42:09.556209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555898 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" Apr 24 22:42:09.556209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555973 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kube-rbac-proxy" Apr 24 22:42:09.556209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555988 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d130507f-f6d9-401d-a5e3-1328301efd06" containerName="kserve-container" Apr 24 22:42:09.556209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.555996 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kserve-container" Apr 24 22:42:09.556209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.556025 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="bca9a52c-f7a2-4936-a746-fd24af9e0506" containerName="kube-rbac-proxy" Apr 24 22:42:09.559050 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.559006 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.561660 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.561639 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-d7349-kube-rbac-proxy-sar-config\"" Apr 24 22:42:09.561756 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.561692 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-d7349-predictor-serving-cert\"" Apr 24 22:42:09.566779 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.566760 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9"] Apr 24 22:42:09.581449 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.581429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vd76\" (UniqueName: \"kubernetes.io/projected/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-kube-api-access-6vd76\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.581535 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.581488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-message-dumper-raw-d7349-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.581588 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.581538 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.682274 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.682250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.682358 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.682286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vd76\" (UniqueName: \"kubernetes.io/projected/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-kube-api-access-6vd76\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.682358 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.682348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-message-dumper-raw-d7349-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.682427 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:42:09.682395 2572 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-serving-cert: secret "message-dumper-raw-d7349-predictor-serving-cert" not found Apr 24 22:42:09.682469 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:42:09.682451 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls podName:7cc35065-8b02-4a65-b7c4-04c94f9ff71b nodeName:}" failed. No retries permitted until 2026-04-24 22:42:10.182431446 +0000 UTC m=+737.330736980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls") pod "message-dumper-raw-d7349-predictor-77567676df-jcnx9" (UID: "7cc35065-8b02-4a65-b7c4-04c94f9ff71b") : secret "message-dumper-raw-d7349-predictor-serving-cert" not found Apr 24 22:42:09.682900 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.682883 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-message-dumper-raw-d7349-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.692473 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.692447 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vd76\" (UniqueName: \"kubernetes.io/projected/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-kube-api-access-6vd76\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:09.884326 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.884288 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerID="ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d" exitCode=2 Apr 24 22:42:09.884472 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.884349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" event={"ID":"92a467b0-f71f-4b28-b810-ef6cfad58340","Type":"ContainerDied","Data":"ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d"} Apr 24 22:42:09.886083 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.886061 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerID="0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91" exitCode=2 Apr 24 22:42:09.886226 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:09.886109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" event={"ID":"5cae113f-ed3f-4799-98a3-0804da8ad90f","Type":"ContainerDied","Data":"0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91"} Apr 24 22:42:10.185718 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.185646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:10.187875 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.187853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls\") pod \"message-dumper-raw-d7349-predictor-77567676df-jcnx9\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " 
pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:10.470137 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.470058 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:10.585467 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.585443 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9"] Apr 24 22:42:10.587949 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.587784 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 22:42:10.588068 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:42:10.587994 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc35065_8b02_4a65_b7c4_04c94f9ff71b.slice/crio-7dcedff4fa4bd824ce8351629111b327d47bf0c56511ac9a9211ab8c95419087 WatchSource:0}: Error finding container 7dcedff4fa4bd824ce8351629111b327d47bf0c56511ac9a9211ab8c95419087: Status 404 returned error can't find the container with id 7dcedff4fa4bd824ce8351629111b327d47bf0c56511ac9a9211ab8c95419087 Apr 24 22:42:10.589621 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.589604 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:42:10.594366 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.594340 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 22:42:10.594629 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.594605 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 24 22:42:10.890361 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:10.890326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" event={"ID":"7cc35065-8b02-4a65-b7c4-04c94f9ff71b","Type":"ContainerStarted","Data":"7dcedff4fa4bd824ce8351629111b327d47bf0c56511ac9a9211ab8c95419087"} Apr 24 22:42:11.609266 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:11.609218 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 22:42:11.895220 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:11.895146 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" event={"ID":"7cc35065-8b02-4a65-b7c4-04c94f9ff71b","Type":"ContainerStarted","Data":"0817db4500aa8a335e9a7d3706d1bc5b908adff4826bd220afa6a828687470d6"} Apr 24 22:42:11.895220 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:11.895183 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" 
event={"ID":"7cc35065-8b02-4a65-b7c4-04c94f9ff71b","Type":"ContainerStarted","Data":"e97527bb69f6c8c068afdc2f3b03e6dcee8d51fbddb3f5d83e8d1d5f3720161b"} Apr 24 22:42:11.895414 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:11.895393 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:11.895535 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:11.895522 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:11.897178 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:11.897160 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:11.913879 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:11.913835 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" podStartSLOduration=1.8588698099999998 podStartE2EDuration="2.91382457s" podCreationTimestamp="2026-04-24 22:42:09 +0000 UTC" firstStartedPulling="2026-04-24 22:42:10.589724003 +0000 UTC m=+737.738029530" lastFinishedPulling="2026-04-24 22:42:11.644678765 +0000 UTC m=+738.792984290" observedRunningTime="2026-04-24 22:42:11.911492905 +0000 UTC m=+739.059798453" watchObservedRunningTime="2026-04-24 22:42:11.91382457 +0000 UTC m=+739.062130187" Apr 24 22:42:12.704595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.704574 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" Apr 24 22:42:12.810029 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.809947 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cae113f-ed3f-4799-98a3-0804da8ad90f-proxy-tls\") pod \"5cae113f-ed3f-4799-98a3-0804da8ad90f\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " Apr 24 22:42:12.810029 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.809985 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cae113f-ed3f-4799-98a3-0804da8ad90f-isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") pod \"5cae113f-ed3f-4799-98a3-0804da8ad90f\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " Apr 24 22:42:12.810215 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.810042 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98wp7\" (UniqueName: \"kubernetes.io/projected/5cae113f-ed3f-4799-98a3-0804da8ad90f-kube-api-access-98wp7\") pod \"5cae113f-ed3f-4799-98a3-0804da8ad90f\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " Apr 24 22:42:12.810215 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.810086 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cae113f-ed3f-4799-98a3-0804da8ad90f-kserve-provision-location\") pod \"5cae113f-ed3f-4799-98a3-0804da8ad90f\" (UID: \"5cae113f-ed3f-4799-98a3-0804da8ad90f\") " Apr 24 22:42:12.810394 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.810370 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cae113f-ed3f-4799-98a3-0804da8ad90f-isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config") pod "5cae113f-ed3f-4799-98a3-0804da8ad90f" (UID: "5cae113f-ed3f-4799-98a3-0804da8ad90f"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:42:12.810463 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.810412 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cae113f-ed3f-4799-98a3-0804da8ad90f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5cae113f-ed3f-4799-98a3-0804da8ad90f" (UID: "5cae113f-ed3f-4799-98a3-0804da8ad90f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:42:12.812071 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.812053 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cae113f-ed3f-4799-98a3-0804da8ad90f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5cae113f-ed3f-4799-98a3-0804da8ad90f" (UID: "5cae113f-ed3f-4799-98a3-0804da8ad90f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:42:12.812158 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.812143 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cae113f-ed3f-4799-98a3-0804da8ad90f-kube-api-access-98wp7" (OuterVolumeSpecName: "kube-api-access-98wp7") pod "5cae113f-ed3f-4799-98a3-0804da8ad90f" (UID: "5cae113f-ed3f-4799-98a3-0804da8ad90f"). InnerVolumeSpecName "kube-api-access-98wp7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:42:12.899478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.899454 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerID="37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0" exitCode=0 Apr 24 22:42:12.899578 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.899538 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" Apr 24 22:42:12.899626 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.899534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" event={"ID":"5cae113f-ed3f-4799-98a3-0804da8ad90f","Type":"ContainerDied","Data":"37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0"} Apr 24 22:42:12.899662 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.899642 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946" event={"ID":"5cae113f-ed3f-4799-98a3-0804da8ad90f","Type":"ContainerDied","Data":"771f27871ae0aac2ae75e7a52f82eab3a36d1f0a744dc3aa6ff8345e18b81dda"} Apr 24 22:42:12.899662 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.899658 2572 scope.go:117] "RemoveContainer" containerID="0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91" Apr 24 22:42:12.907443 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.907415 2572 scope.go:117] "RemoveContainer" containerID="37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0" Apr 24 22:42:12.910621 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.910604 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cae113f-ed3f-4799-98a3-0804da8ad90f-kserve-provision-location\") on 
node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:42:12.910694 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.910624 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cae113f-ed3f-4799-98a3-0804da8ad90f-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:42:12.910694 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.910635 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cae113f-ed3f-4799-98a3-0804da8ad90f-isvc-xgboost-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:42:12.910694 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.910646 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-98wp7\" (UniqueName: \"kubernetes.io/projected/5cae113f-ed3f-4799-98a3-0804da8ad90f-kube-api-access-98wp7\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:42:12.914021 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.913992 2572 scope.go:117] "RemoveContainer" containerID="b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160" Apr 24 22:42:12.920252 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.920233 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"] Apr 24 22:42:12.920784 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.920772 2572 scope.go:117] "RemoveContainer" containerID="0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91" Apr 24 22:42:12.921026 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:42:12.920998 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91\": container with ID 
starting with 0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91 not found: ID does not exist" containerID="0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91" Apr 24 22:42:12.921089 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.921035 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91"} err="failed to get container status \"0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91\": rpc error: code = NotFound desc = could not find container \"0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91\": container with ID starting with 0fc24b71ad487528a8cd178289d5c035e4e7dbd181cca67f9bd20dae1c182e91 not found: ID does not exist" Apr 24 22:42:12.921089 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.921050 2572 scope.go:117] "RemoveContainer" containerID="37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0" Apr 24 22:42:12.921244 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:42:12.921226 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0\": container with ID starting with 37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0 not found: ID does not exist" containerID="37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0" Apr 24 22:42:12.921280 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.921249 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0"} err="failed to get container status \"37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0\": rpc error: code = NotFound desc = could not find container \"37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0\": container with ID starting with 
37328229d698464d813494a3300b56a9f454947c21a34a7c0596722d41fdfbc0 not found: ID does not exist" Apr 24 22:42:12.921280 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.921262 2572 scope.go:117] "RemoveContainer" containerID="b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160" Apr 24 22:42:12.921459 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:42:12.921446 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160\": container with ID starting with b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160 not found: ID does not exist" containerID="b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160" Apr 24 22:42:12.921504 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.921464 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160"} err="failed to get container status \"b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160\": rpc error: code = NotFound desc = could not find container \"b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160\": container with ID starting with b2b2bcc44ea747b9bc4f8e736f5a3577379d95d7d88cb086a17c468217244160 not found: ID does not exist" Apr 24 22:42:12.926157 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:12.926137 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-133b5-predictor-7cf69f4885-67946"] Apr 24 22:42:13.418797 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.418772 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" path="/var/lib/kubelet/pods/5cae113f-ed3f-4799-98a3-0804da8ad90f/volumes" Apr 24 22:42:13.486651 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.486631 2572 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" Apr 24 22:42:13.515252 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.515221 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92a467b0-f71f-4b28-b810-ef6cfad58340-proxy-tls\") pod \"92a467b0-f71f-4b28-b810-ef6cfad58340\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " Apr 24 22:42:13.515352 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.515309 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92a467b0-f71f-4b28-b810-ef6cfad58340-kserve-provision-location\") pod \"92a467b0-f71f-4b28-b810-ef6cfad58340\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " Apr 24 22:42:13.515352 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.515347 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv4wk\" (UniqueName: \"kubernetes.io/projected/92a467b0-f71f-4b28-b810-ef6cfad58340-kube-api-access-cv4wk\") pod \"92a467b0-f71f-4b28-b810-ef6cfad58340\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " Apr 24 22:42:13.515468 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.515453 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92a467b0-f71f-4b28-b810-ef6cfad58340-isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") pod \"92a467b0-f71f-4b28-b810-ef6cfad58340\" (UID: \"92a467b0-f71f-4b28-b810-ef6cfad58340\") " Apr 24 22:42:13.515585 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.515558 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a467b0-f71f-4b28-b810-ef6cfad58340-kserve-provision-location" 
(OuterVolumeSpecName: "kserve-provision-location") pod "92a467b0-f71f-4b28-b810-ef6cfad58340" (UID: "92a467b0-f71f-4b28-b810-ef6cfad58340"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:42:13.515709 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.515687 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92a467b0-f71f-4b28-b810-ef6cfad58340-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:42:13.515814 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.515773 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a467b0-f71f-4b28-b810-ef6cfad58340-isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config") pod "92a467b0-f71f-4b28-b810-ef6cfad58340" (UID: "92a467b0-f71f-4b28-b810-ef6cfad58340"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:42:13.517297 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.517276 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a467b0-f71f-4b28-b810-ef6cfad58340-kube-api-access-cv4wk" (OuterVolumeSpecName: "kube-api-access-cv4wk") pod "92a467b0-f71f-4b28-b810-ef6cfad58340" (UID: "92a467b0-f71f-4b28-b810-ef6cfad58340"). InnerVolumeSpecName "kube-api-access-cv4wk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:42:13.517372 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.517316 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a467b0-f71f-4b28-b810-ef6cfad58340-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "92a467b0-f71f-4b28-b810-ef6cfad58340" (UID: "92a467b0-f71f-4b28-b810-ef6cfad58340"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:42:13.616953 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.616896 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cv4wk\" (UniqueName: \"kubernetes.io/projected/92a467b0-f71f-4b28-b810-ef6cfad58340-kube-api-access-cv4wk\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:42:13.616953 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.616920 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92a467b0-f71f-4b28-b810-ef6cfad58340-isvc-sklearn-graph-raw-hpa-133b5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:42:13.616953 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.616936 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92a467b0-f71f-4b28-b810-ef6cfad58340-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:42:13.906003 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.905952 2572 generic.go:358] "Generic (PLEG): container finished" podID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerID="b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef" exitCode=0 Apr 24 22:42:13.906330 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.906037 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" event={"ID":"92a467b0-f71f-4b28-b810-ef6cfad58340","Type":"ContainerDied","Data":"b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef"} Apr 24 22:42:13.906330 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.906056 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" Apr 24 22:42:13.906330 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.906076 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c" event={"ID":"92a467b0-f71f-4b28-b810-ef6cfad58340","Type":"ContainerDied","Data":"be2ce0bf82a7492c8831b1e0eac288089b5fbbb9cf7b5a467770d2bdd055f536"} Apr 24 22:42:13.906330 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.906097 2572 scope.go:117] "RemoveContainer" containerID="ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d" Apr 24 22:42:13.914762 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.914746 2572 scope.go:117] "RemoveContainer" containerID="b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef" Apr 24 22:42:13.921675 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.921658 2572 scope.go:117] "RemoveContainer" containerID="60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556" Apr 24 22:42:13.928471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.928443 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"] Apr 24 22:42:13.929160 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.929140 2572 scope.go:117] "RemoveContainer" containerID="ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d" Apr 24 22:42:13.929407 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:42:13.929389 2572 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d\": container with ID starting with ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d not found: ID does not exist" containerID="ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d" Apr 24 22:42:13.929474 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.929412 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d"} err="failed to get container status \"ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d\": rpc error: code = NotFound desc = could not find container \"ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d\": container with ID starting with ec5b1ab525ccee4abaa974746bf0f9639f6d5012580014f36953cf1dded14a6d not found: ID does not exist" Apr 24 22:42:13.929474 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.929430 2572 scope.go:117] "RemoveContainer" containerID="b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef" Apr 24 22:42:13.929697 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:42:13.929670 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef\": container with ID starting with b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef not found: ID does not exist" containerID="b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef" Apr 24 22:42:13.929739 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.929703 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef"} err="failed to get container status 
\"b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef\": rpc error: code = NotFound desc = could not find container \"b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef\": container with ID starting with b19971b1590700f0396b65b3e9145236463d2bc06bde9d3cdbaea3917d8748ef not found: ID does not exist" Apr 24 22:42:13.929739 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.929720 2572 scope.go:117] "RemoveContainer" containerID="60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556" Apr 24 22:42:13.929965 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:42:13.929951 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556\": container with ID starting with 60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556 not found: ID does not exist" containerID="60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556" Apr 24 22:42:13.929997 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.929969 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556"} err="failed to get container status \"60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556\": rpc error: code = NotFound desc = could not find container \"60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556\": container with ID starting with 60b2ec7ac41d0c4c24b8017125fae7f37d356d7daf8acbfa766171042a32e556 not found: ID does not exist" Apr 24 22:42:13.933902 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:13.933882 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-133b5-predictor-7766867b4-9wk9c"] Apr 24 22:42:15.424471 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:15.420390 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" path="/var/lib/kubelet/pods/92a467b0-f71f-4b28-b810-ef6cfad58340/volumes" Apr 24 22:42:18.910349 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:18.910292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:42:19.577084 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577046 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh"] Apr 24 22:42:19.577404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577391 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" Apr 24 22:42:19.577452 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577405 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" Apr 24 22:42:19.577452 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577414 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="storage-initializer" Apr 24 22:42:19.577452 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577420 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="storage-initializer" Apr 24 22:42:19.577452 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577431 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="storage-initializer" Apr 24 22:42:19.577452 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577437 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="storage-initializer" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:42:19.577460 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kube-rbac-proxy" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577466 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kube-rbac-proxy" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577473 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kube-rbac-proxy" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577478 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kube-rbac-proxy" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577488 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577493 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577547 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kserve-container" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577554 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kserve-container" Apr 24 22:42:19.577595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.577561 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cae113f-ed3f-4799-98a3-0804da8ad90f" containerName="kube-rbac-proxy" Apr 24 22:42:19.577595 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:42:19.577569 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="92a467b0-f71f-4b28-b810-ef6cfad58340" containerName="kube-rbac-proxy" Apr 24 22:42:19.582588 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.582571 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.585093 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.585063 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-d7349-predictor-serving-cert\"" Apr 24 22:42:19.585093 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.585069 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\"" Apr 24 22:42:19.588388 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.588365 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh"] Apr 24 22:42:19.655298 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.655275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/528c314a-1af7-4b7b-9bf8-684f8decd693-proxy-tls\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.655392 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.655305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/528c314a-1af7-4b7b-9bf8-684f8decd693-kserve-provision-location\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " 
pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.655392 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.655324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6hln\" (UniqueName: \"kubernetes.io/projected/528c314a-1af7-4b7b-9bf8-684f8decd693-kube-api-access-r6hln\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.655466 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.655445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/528c314a-1af7-4b7b-9bf8-684f8decd693-isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.755747 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.755720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/528c314a-1af7-4b7b-9bf8-684f8decd693-kserve-provision-location\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.755847 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.755750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6hln\" (UniqueName: \"kubernetes.io/projected/528c314a-1af7-4b7b-9bf8-684f8decd693-kube-api-access-r6hln\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " 
pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.755847 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.755799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/528c314a-1af7-4b7b-9bf8-684f8decd693-isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.755917 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.755845 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/528c314a-1af7-4b7b-9bf8-684f8decd693-proxy-tls\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.756134 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.756113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/528c314a-1af7-4b7b-9bf8-684f8decd693-kserve-provision-location\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.756503 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.756486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/528c314a-1af7-4b7b-9bf8-684f8decd693-isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " 
pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.758073 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.758057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/528c314a-1af7-4b7b-9bf8-684f8decd693-proxy-tls\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.763448 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.763430 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6hln\" (UniqueName: \"kubernetes.io/projected/528c314a-1af7-4b7b-9bf8-684f8decd693-kube-api-access-r6hln\") pod \"isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:19.893988 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:19.893925 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:20.011635 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:20.011609 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh"] Apr 24 22:42:20.013471 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:42:20.013445 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528c314a_1af7_4b7b_9bf8_684f8decd693.slice/crio-9f3805d0d44991b82c8ba4e57dc7e7ee988f3e371969f137a8b569e4b0c5b7a5 WatchSource:0}: Error finding container 9f3805d0d44991b82c8ba4e57dc7e7ee988f3e371969f137a8b569e4b0c5b7a5: Status 404 returned error can't find the container with id 9f3805d0d44991b82c8ba4e57dc7e7ee988f3e371969f137a8b569e4b0c5b7a5 Apr 24 22:42:20.932693 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:20.932649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerStarted","Data":"f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb"} Apr 24 22:42:20.932693 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:20.932692 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerStarted","Data":"9f3805d0d44991b82c8ba4e57dc7e7ee988f3e371969f137a8b569e4b0c5b7a5"} Apr 24 22:42:23.947846 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:23.947777 2572 generic.go:358] "Generic (PLEG): container finished" podID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerID="f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb" exitCode=0 Apr 24 22:42:23.947846 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:23.947839 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerDied","Data":"f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb"} Apr 24 22:42:24.953180 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:24.953135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerStarted","Data":"ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c"} Apr 24 22:42:24.953180 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:24.953186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerStarted","Data":"e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397"} Apr 24 22:42:24.953623 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:24.953198 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerStarted","Data":"bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858"} Apr 24 22:42:24.953623 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:24.953495 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:24.953623 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:24.953586 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:24.954775 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:24.954752 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:42:24.973157 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:24.973111 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podStartSLOduration=5.973097921 podStartE2EDuration="5.973097921s" podCreationTimestamp="2026-04-24 22:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:42:24.971238948 +0000 UTC m=+752.119544498" watchObservedRunningTime="2026-04-24 22:42:24.973097921 +0000 UTC m=+752.121403469" Apr 24 22:42:25.956463 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:25.956429 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:25.956929 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:25.956626 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:42:25.957525 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:25.957496 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:26.960007 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:26.959963 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:42:26.960441 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:26.960351 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:31.964300 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:31.964270 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:42:31.964835 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:31.964803 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:42:31.965152 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:31.965125 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:41.965579 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:41.965541 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:42:41.966102 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:42:41.966028 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:42:51.965284 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:51.965244 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:42:51.965696 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:42:51.965650 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:43:01.965131 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:01.965094 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:43:01.965597 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:01.965561 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:43:11.965242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:11.965158 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" 
podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:43:11.965661 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:11.965639 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:43:21.965165 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:21.965116 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:43:21.965651 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:21.965515 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:43:31.965756 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:31.965724 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:43:31.966297 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:31.966030 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:43:44.605583 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.605546 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-d7349-predictor-77567676df-jcnx9_7cc35065-8b02-4a65-b7c4-04c94f9ff71b/kserve-container/0.log" Apr 24 
22:43:44.793520 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.793490 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh"] Apr 24 22:43:44.794042 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.793846 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" containerID="cri-o://bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858" gracePeriod=30 Apr 24 22:43:44.794042 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.793871 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" containerID="cri-o://ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c" gracePeriod=30 Apr 24 22:43:44.794042 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.793935 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" containerID="cri-o://e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397" gracePeriod=30 Apr 24 22:43:44.868749 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.868692 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2"] Apr 24 22:43:44.872257 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.872240 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:44.876455 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.876437 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-eb854-predictor-serving-cert\"" Apr 24 22:43:44.876691 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.876676 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\"" Apr 24 22:43:44.882345 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.882325 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2"] Apr 24 22:43:44.997650 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.997624 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntcv\" (UniqueName: \"kubernetes.io/projected/b07a71e7-a076-457d-a1af-42e821516004-kube-api-access-gntcv\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:44.997802 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.997706 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b07a71e7-a076-457d-a1af-42e821516004-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:44.997802 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.997740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b07a71e7-a076-457d-a1af-42e821516004-isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:44.997802 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:44.997785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b07a71e7-a076-457d-a1af-42e821516004-proxy-tls\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.011249 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.011225 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9"] Apr 24 22:43:45.011465 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.011439 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerName="kserve-container" containerID="cri-o://e97527bb69f6c8c068afdc2f3b03e6dcee8d51fbddb3f5d83e8d1d5f3720161b" gracePeriod=30 Apr 24 22:43:45.011590 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.011491 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerName="kube-rbac-proxy" containerID="cri-o://0817db4500aa8a335e9a7d3706d1bc5b908adff4826bd220afa6a828687470d6" gracePeriod=30 Apr 24 22:43:45.098929 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:43:45.098907 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b07a71e7-a076-457d-a1af-42e821516004-proxy-tls\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.099052 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.098950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gntcv\" (UniqueName: \"kubernetes.io/projected/b07a71e7-a076-457d-a1af-42e821516004-kube-api-access-gntcv\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.099052 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.098989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b07a71e7-a076-457d-a1af-42e821516004-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.099052 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.099031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b07a71e7-a076-457d-a1af-42e821516004-isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.099370 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:43:45.099351 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b07a71e7-a076-457d-a1af-42e821516004-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.099611 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.099590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b07a71e7-a076-457d-a1af-42e821516004-isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.101212 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.101195 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b07a71e7-a076-457d-a1af-42e821516004-proxy-tls\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.118992 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.118937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntcv\" (UniqueName: \"kubernetes.io/projected/b07a71e7-a076-457d-a1af-42e821516004-kube-api-access-gntcv\") pod \"isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.182330 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.182302 2572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:45.202366 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.202334 2572 generic.go:358] "Generic (PLEG): container finished" podID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerID="e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397" exitCode=2 Apr 24 22:43:45.202472 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.202406 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerDied","Data":"e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397"} Apr 24 22:43:45.204422 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.204402 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerID="0817db4500aa8a335e9a7d3706d1bc5b908adff4826bd220afa6a828687470d6" exitCode=2 Apr 24 22:43:45.204422 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.204423 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerID="e97527bb69f6c8c068afdc2f3b03e6dcee8d51fbddb3f5d83e8d1d5f3720161b" exitCode=2 Apr 24 22:43:45.204577 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.204467 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" event={"ID":"7cc35065-8b02-4a65-b7c4-04c94f9ff71b","Type":"ContainerDied","Data":"0817db4500aa8a335e9a7d3706d1bc5b908adff4826bd220afa6a828687470d6"} Apr 24 22:43:45.204577 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.204500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" 
event={"ID":"7cc35065-8b02-4a65-b7c4-04c94f9ff71b","Type":"ContainerDied","Data":"e97527bb69f6c8c068afdc2f3b03e6dcee8d51fbddb3f5d83e8d1d5f3720161b"} Apr 24 22:43:45.240860 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.240835 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:43:45.301007 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.300983 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls\") pod \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " Apr 24 22:43:45.301141 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.301117 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-message-dumper-raw-d7349-kube-rbac-proxy-sar-config\") pod \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " Apr 24 22:43:45.301206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.301186 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vd76\" (UniqueName: \"kubernetes.io/projected/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-kube-api-access-6vd76\") pod \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\" (UID: \"7cc35065-8b02-4a65-b7c4-04c94f9ff71b\") " Apr 24 22:43:45.301522 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.301477 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-message-dumper-raw-d7349-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-d7349-kube-rbac-proxy-sar-config") pod "7cc35065-8b02-4a65-b7c4-04c94f9ff71b" (UID: 
"7cc35065-8b02-4a65-b7c4-04c94f9ff71b"). InnerVolumeSpecName "message-dumper-raw-d7349-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:43:45.303483 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.303453 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-kube-api-access-6vd76" (OuterVolumeSpecName: "kube-api-access-6vd76") pod "7cc35065-8b02-4a65-b7c4-04c94f9ff71b" (UID: "7cc35065-8b02-4a65-b7c4-04c94f9ff71b"). InnerVolumeSpecName "kube-api-access-6vd76". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:43:45.303595 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.303525 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7cc35065-8b02-4a65-b7c4-04c94f9ff71b" (UID: "7cc35065-8b02-4a65-b7c4-04c94f9ff71b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:43:45.304303 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.304232 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2"] Apr 24 22:43:45.306311 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:43:45.306286 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb07a71e7_a076_457d_a1af_42e821516004.slice/crio-edd52ae8b0d8b02fbebfaf95057218a6bd4c13f37cf8e0195b3d56d2f5691a42 WatchSource:0}: Error finding container edd52ae8b0d8b02fbebfaf95057218a6bd4c13f37cf8e0195b3d56d2f5691a42: Status 404 returned error can't find the container with id edd52ae8b0d8b02fbebfaf95057218a6bd4c13f37cf8e0195b3d56d2f5691a42 Apr 24 22:43:45.401804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.401758 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:43:45.401804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.401780 2572 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-message-dumper-raw-d7349-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:43:45.401804 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:45.401790 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vd76\" (UniqueName: \"kubernetes.io/projected/7cc35065-8b02-4a65-b7c4-04c94f9ff71b-kube-api-access-6vd76\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:43:46.213044 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.212935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" event={"ID":"b07a71e7-a076-457d-a1af-42e821516004","Type":"ContainerStarted","Data":"230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e"} Apr 24 22:43:46.213044 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.212985 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" event={"ID":"b07a71e7-a076-457d-a1af-42e821516004","Type":"ContainerStarted","Data":"edd52ae8b0d8b02fbebfaf95057218a6bd4c13f37cf8e0195b3d56d2f5691a42"} Apr 24 22:43:46.215238 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.215203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" event={"ID":"7cc35065-8b02-4a65-b7c4-04c94f9ff71b","Type":"ContainerDied","Data":"7dcedff4fa4bd824ce8351629111b327d47bf0c56511ac9a9211ab8c95419087"} Apr 24 22:43:46.215369 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.215251 2572 scope.go:117] "RemoveContainer" containerID="0817db4500aa8a335e9a7d3706d1bc5b908adff4826bd220afa6a828687470d6" Apr 24 22:43:46.215500 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.215480 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9" Apr 24 22:43:46.222991 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.222965 2572 scope.go:117] "RemoveContainer" containerID="e97527bb69f6c8c068afdc2f3b03e6dcee8d51fbddb3f5d83e8d1d5f3720161b" Apr 24 22:43:46.253334 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.253310 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9"] Apr 24 22:43:46.257516 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.257496 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d7349-predictor-77567676df-jcnx9"] Apr 24 22:43:46.960079 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:46.960039 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 22:43:47.418155 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:47.418115 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" path="/var/lib/kubelet/pods/7cc35065-8b02-4a65-b7c4-04c94f9ff71b/volumes" Apr 24 22:43:49.228244 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:49.228162 2572 generic.go:358] "Generic (PLEG): container finished" podID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerID="bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858" exitCode=0 Apr 24 22:43:49.228582 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:49.228232 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" 
event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerDied","Data":"bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858"} Apr 24 22:43:49.229709 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:49.229683 2572 generic.go:358] "Generic (PLEG): container finished" podID="b07a71e7-a076-457d-a1af-42e821516004" containerID="230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e" exitCode=0 Apr 24 22:43:49.229834 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:49.229763 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" event={"ID":"b07a71e7-a076-457d-a1af-42e821516004","Type":"ContainerDied","Data":"230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e"} Apr 24 22:43:50.235314 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:50.235282 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" event={"ID":"b07a71e7-a076-457d-a1af-42e821516004","Type":"ContainerStarted","Data":"e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882"} Apr 24 22:43:50.235743 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:50.235324 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" event={"ID":"b07a71e7-a076-457d-a1af-42e821516004","Type":"ContainerStarted","Data":"7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660"} Apr 24 22:43:50.235743 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:50.235600 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:50.257965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:50.257918 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podStartSLOduration=6.257904147 podStartE2EDuration="6.257904147s" podCreationTimestamp="2026-04-24 22:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:43:50.256193804 +0000 UTC m=+837.404499363" watchObservedRunningTime="2026-04-24 22:43:50.257904147 +0000 UTC m=+837.406209767" Apr 24 22:43:51.238732 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:51.238706 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:51.239916 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:51.239890 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:43:51.960806 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:51.960764 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 22:43:51.965135 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:51.965110 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:43:51.965504 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:51.965485 2572 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:43:52.242573 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:52.242542 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:43:56.960806 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:56.960762 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 22:43:56.961192 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:56.960918 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:43:57.248645 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:57.248622 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:43:57.249142 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:43:57.249118 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:44:01.960327 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:01.960291 2572 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 22:44:01.965643 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:01.965616 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:44:01.966005 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:01.965979 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:44:06.960680 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:06.960639 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 22:44:07.249610 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:07.249577 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:44:11.960573 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:11.960530 2572 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 22:44:11.964817 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:11.964779 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 22:44:11.964942 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:11.964926 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:44:11.965239 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:11.965212 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:44:11.965325 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:11.965313 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:44:14.943201 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:14.943180 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:44:15.128513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.128446 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/528c314a-1af7-4b7b-9bf8-684f8decd693-kserve-provision-location\") pod \"528c314a-1af7-4b7b-9bf8-684f8decd693\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " Apr 24 22:44:15.128513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.128488 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/528c314a-1af7-4b7b-9bf8-684f8decd693-proxy-tls\") pod \"528c314a-1af7-4b7b-9bf8-684f8decd693\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " Apr 24 22:44:15.128672 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.128536 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6hln\" (UniqueName: \"kubernetes.io/projected/528c314a-1af7-4b7b-9bf8-684f8decd693-kube-api-access-r6hln\") pod \"528c314a-1af7-4b7b-9bf8-684f8decd693\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " Apr 24 22:44:15.128672 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.128555 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/528c314a-1af7-4b7b-9bf8-684f8decd693-isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\") pod \"528c314a-1af7-4b7b-9bf8-684f8decd693\" (UID: \"528c314a-1af7-4b7b-9bf8-684f8decd693\") " Apr 24 22:44:15.128882 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.128792 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/528c314a-1af7-4b7b-9bf8-684f8decd693-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"528c314a-1af7-4b7b-9bf8-684f8decd693" (UID: "528c314a-1af7-4b7b-9bf8-684f8decd693"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:44:15.128963 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.128938 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/528c314a-1af7-4b7b-9bf8-684f8decd693-isvc-logger-raw-d7349-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-d7349-kube-rbac-proxy-sar-config") pod "528c314a-1af7-4b7b-9bf8-684f8decd693" (UID: "528c314a-1af7-4b7b-9bf8-684f8decd693"). InnerVolumeSpecName "isvc-logger-raw-d7349-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:44:15.130513 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.130488 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528c314a-1af7-4b7b-9bf8-684f8decd693-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "528c314a-1af7-4b7b-9bf8-684f8decd693" (UID: "528c314a-1af7-4b7b-9bf8-684f8decd693"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:44:15.130706 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.130680 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528c314a-1af7-4b7b-9bf8-684f8decd693-kube-api-access-r6hln" (OuterVolumeSpecName: "kube-api-access-r6hln") pod "528c314a-1af7-4b7b-9bf8-684f8decd693" (UID: "528c314a-1af7-4b7b-9bf8-684f8decd693"). InnerVolumeSpecName "kube-api-access-r6hln". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:44:15.229290 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.229266 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r6hln\" (UniqueName: \"kubernetes.io/projected/528c314a-1af7-4b7b-9bf8-684f8decd693-kube-api-access-r6hln\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:44:15.229290 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.229288 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/528c314a-1af7-4b7b-9bf8-684f8decd693-isvc-logger-raw-d7349-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:44:15.229452 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.229298 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/528c314a-1af7-4b7b-9bf8-684f8decd693-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:44:15.229452 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.229308 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/528c314a-1af7-4b7b-9bf8-684f8decd693-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:44:15.317390 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.317363 2572 generic.go:358] "Generic (PLEG): container finished" podID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerID="ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c" exitCode=0 Apr 24 22:44:15.317480 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.317447 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" Apr 24 22:44:15.317480 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.317454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerDied","Data":"ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c"} Apr 24 22:44:15.317559 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.317491 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh" event={"ID":"528c314a-1af7-4b7b-9bf8-684f8decd693","Type":"ContainerDied","Data":"9f3805d0d44991b82c8ba4e57dc7e7ee988f3e371969f137a8b569e4b0c5b7a5"} Apr 24 22:44:15.317559 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.317509 2572 scope.go:117] "RemoveContainer" containerID="ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c" Apr 24 22:44:15.325653 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.325639 2572 scope.go:117] "RemoveContainer" containerID="e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397" Apr 24 22:44:15.332094 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.332071 2572 scope.go:117] "RemoveContainer" containerID="bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858" Apr 24 22:44:15.338932 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.338916 2572 scope.go:117] "RemoveContainer" containerID="f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb" Apr 24 22:44:15.339620 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.339604 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh"] Apr 24 22:44:15.344465 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.344446 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-logger-raw-d7349-predictor-7d5d69f59d-bxknh"] Apr 24 22:44:15.345879 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.345864 2572 scope.go:117] "RemoveContainer" containerID="ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c" Apr 24 22:44:15.346177 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:44:15.346159 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c\": container with ID starting with ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c not found: ID does not exist" containerID="ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c" Apr 24 22:44:15.346251 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.346183 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c"} err="failed to get container status \"ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c\": rpc error: code = NotFound desc = could not find container \"ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c\": container with ID starting with ebb5ebbfdbede68a5f9bf34032963b98891ab7252434d4f02ac6597acbebfc9c not found: ID does not exist" Apr 24 22:44:15.346251 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.346199 2572 scope.go:117] "RemoveContainer" containerID="e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397" Apr 24 22:44:15.346411 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:44:15.346394 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397\": container with ID starting with e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397 not found: ID does not exist" 
containerID="e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397" Apr 24 22:44:15.346457 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.346420 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397"} err="failed to get container status \"e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397\": rpc error: code = NotFound desc = could not find container \"e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397\": container with ID starting with e0197efa243e846fec00eabd8e87a199faad6f58da3cd66a9e16782cdab94397 not found: ID does not exist" Apr 24 22:44:15.346457 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.346445 2572 scope.go:117] "RemoveContainer" containerID="bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858" Apr 24 22:44:15.346687 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:44:15.346670 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858\": container with ID starting with bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858 not found: ID does not exist" containerID="bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858" Apr 24 22:44:15.346759 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.346707 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858"} err="failed to get container status \"bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858\": rpc error: code = NotFound desc = could not find container \"bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858\": container with ID starting with bfa8e9ce780d423a33af24921129b9b1691167d6c5bcdc7f8ec284c889de3858 not found: ID does not exist" Apr 24 
22:44:15.346759 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.346731 2572 scope.go:117] "RemoveContainer" containerID="f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb" Apr 24 22:44:15.346958 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:44:15.346942 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb\": container with ID starting with f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb not found: ID does not exist" containerID="f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb" Apr 24 22:44:15.347001 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.346963 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb"} err="failed to get container status \"f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb\": rpc error: code = NotFound desc = could not find container \"f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb\": container with ID starting with f8917efdef280fbafff5fdb9e12aca86bf80f62f4c44e47999714037435c5abb not found: ID does not exist" Apr 24 22:44:15.417528 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:15.417482 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" path="/var/lib/kubelet/pods/528c314a-1af7-4b7b-9bf8-684f8decd693/volumes" Apr 24 22:44:17.249219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:17.249182 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:44:27.249481 ip-10-0-137-103 
kubenswrapper[2572]: I0424 22:44:27.249436 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:44:37.249159 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:37.249110 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:44:47.249828 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:47.249737 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:44:57.250146 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:44:57.250103 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:45:07.249815 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:07.249768 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:45:13.416417 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:13.416373 2572 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:45:23.416583 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:23.416531 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:45:33.416388 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:33.416339 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:45:43.416539 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:43.416493 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 22:45:53.423156 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:53.423125 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:45:55.013683 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.013642 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2"] Apr 24 22:45:55.014136 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.014055 2572 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container" containerID="cri-o://7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660" gracePeriod=30 Apr 24 22:45:55.014218 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.014186 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kube-rbac-proxy" containerID="cri-o://e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882" gracePeriod=30 Apr 24 22:45:55.121327 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121296 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh"] Apr 24 22:45:55.121654 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121641 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerName="kube-rbac-proxy" Apr 24 22:45:55.121654 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121655 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerName="kube-rbac-proxy" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121669 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerName="kserve-container" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121675 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerName="kserve-container" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121682 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121687 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121703 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="storage-initializer" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121710 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="storage-initializer" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121715 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121721 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121728 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" Apr 24 22:45:55.121750 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121733 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" Apr 24 22:45:55.122121 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121786 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="agent" Apr 24 22:45:55.122121 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121794 2572 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerName="kserve-container" Apr 24 22:45:55.122121 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121803 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kserve-container" Apr 24 22:45:55.122121 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121811 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cc35065-8b02-4a65-b7c4-04c94f9ff71b" containerName="kube-rbac-proxy" Apr 24 22:45:55.122121 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.121817 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="528c314a-1af7-4b7b-9bf8-684f8decd693" containerName="kube-rbac-proxy" Apr 24 22:45:55.124958 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.124939 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.127408 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.127386 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-51530a-kube-rbac-proxy-sar-config\"" Apr 24 22:45:55.127527 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.127398 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-51530a-predictor-serving-cert\"" Apr 24 22:45:55.136103 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.136079 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh"] Apr 24 22:45:55.266904 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.266833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/ce65cd98-c037-43cf-8e8d-4a5027764ab6-isvc-primary-51530a-kube-rbac-proxy-sar-config\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.266904 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.266880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.266904 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.266901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kserve-provision-location\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.267150 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.266950 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5cm\" (UniqueName: \"kubernetes.io/projected/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kube-api-access-pk5cm\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.368319 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.368292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/ce65cd98-c037-43cf-8e8d-4a5027764ab6-isvc-primary-51530a-kube-rbac-proxy-sar-config\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.368456 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.368339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.368456 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.368360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kserve-provision-location\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.368456 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.368392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5cm\" (UniqueName: \"kubernetes.io/projected/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kube-api-access-pk5cm\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.368601 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:45:55.368448 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-51530a-predictor-serving-cert: secret "isvc-primary-51530a-predictor-serving-cert" not found Apr 24 22:45:55.368601 ip-10-0-137-103 kubenswrapper[2572]: E0424 
22:45:55.368520 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls podName:ce65cd98-c037-43cf-8e8d-4a5027764ab6 nodeName:}" failed. No retries permitted until 2026-04-24 22:45:55.868498937 +0000 UTC m=+963.016804468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls") pod "isvc-primary-51530a-predictor-77f766b669-jfmsh" (UID: "ce65cd98-c037-43cf-8e8d-4a5027764ab6") : secret "isvc-primary-51530a-predictor-serving-cert" not found Apr 24 22:45:55.368786 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.368763 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kserve-provision-location\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.368977 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.368961 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce65cd98-c037-43cf-8e8d-4a5027764ab6-isvc-primary-51530a-kube-rbac-proxy-sar-config\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.377774 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.377750 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5cm\" (UniqueName: \"kubernetes.io/projected/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kube-api-access-pk5cm\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " 
pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.615409 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.615340 2572 generic.go:358] "Generic (PLEG): container finished" podID="b07a71e7-a076-457d-a1af-42e821516004" containerID="e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882" exitCode=2 Apr 24 22:45:55.615535 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.615414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" event={"ID":"b07a71e7-a076-457d-a1af-42e821516004","Type":"ContainerDied","Data":"e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882"} Apr 24 22:45:55.873187 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.873129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:55.875538 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:55.875507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls\") pod \"isvc-primary-51530a-predictor-77f766b669-jfmsh\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:56.035669 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:56.035648 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:45:56.150023 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:56.149987 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh"] Apr 24 22:45:56.152437 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:45:56.152409 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce65cd98_c037_43cf_8e8d_4a5027764ab6.slice/crio-8afec3f7e58f496171ed0b4b272b534f888e9d758526deb93e47c23573323d8b WatchSource:0}: Error finding container 8afec3f7e58f496171ed0b4b272b534f888e9d758526deb93e47c23573323d8b: Status 404 returned error can't find the container with id 8afec3f7e58f496171ed0b4b272b534f888e9d758526deb93e47c23573323d8b Apr 24 22:45:56.619656 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:56.619619 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" event={"ID":"ce65cd98-c037-43cf-8e8d-4a5027764ab6","Type":"ContainerStarted","Data":"e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556"} Apr 24 22:45:56.619656 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:56.619661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" event={"ID":"ce65cd98-c037-43cf-8e8d-4a5027764ab6","Type":"ContainerStarted","Data":"8afec3f7e58f496171ed0b4b272b534f888e9d758526deb93e47c23573323d8b"} Apr 24 22:45:57.243616 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:45:57.243582 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: 
connection refused" Apr 24 22:46:00.632579 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:00.632544 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerID="e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556" exitCode=0 Apr 24 22:46:00.632917 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:00.632618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" event={"ID":"ce65cd98-c037-43cf-8e8d-4a5027764ab6","Type":"ContainerDied","Data":"e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556"} Apr 24 22:46:01.638279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:01.638246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" event={"ID":"ce65cd98-c037-43cf-8e8d-4a5027764ab6","Type":"ContainerStarted","Data":"28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c"} Apr 24 22:46:01.638279 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:01.638283 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" event={"ID":"ce65cd98-c037-43cf-8e8d-4a5027764ab6","Type":"ContainerStarted","Data":"1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215"} Apr 24 22:46:01.638683 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:01.638621 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:46:01.638747 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:01.638726 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:46:01.639866 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:01.639835 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 22:46:01.661321 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:01.661276 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podStartSLOduration=6.661266498 podStartE2EDuration="6.661266498s" podCreationTimestamp="2026-04-24 22:45:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:46:01.659949259 +0000 UTC m=+968.808254806" watchObservedRunningTime="2026-04-24 22:46:01.661266498 +0000 UTC m=+968.809572046" Apr 24 22:46:02.242994 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:02.242955 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 22:46:02.642308 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:02.642214 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 22:46:03.151742 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.151720 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" Apr 24 22:46:03.227179 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.227130 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gntcv\" (UniqueName: \"kubernetes.io/projected/b07a71e7-a076-457d-a1af-42e821516004-kube-api-access-gntcv\") pod \"b07a71e7-a076-457d-a1af-42e821516004\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " Apr 24 22:46:03.227179 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.227178 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b07a71e7-a076-457d-a1af-42e821516004-isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\") pod \"b07a71e7-a076-457d-a1af-42e821516004\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " Apr 24 22:46:03.227317 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.227213 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b07a71e7-a076-457d-a1af-42e821516004-proxy-tls\") pod \"b07a71e7-a076-457d-a1af-42e821516004\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " Apr 24 22:46:03.227352 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.227320 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b07a71e7-a076-457d-a1af-42e821516004-kserve-provision-location\") pod \"b07a71e7-a076-457d-a1af-42e821516004\" (UID: \"b07a71e7-a076-457d-a1af-42e821516004\") " Apr 24 22:46:03.227652 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.227625 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b07a71e7-a076-457d-a1af-42e821516004-isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config") pod "b07a71e7-a076-457d-a1af-42e821516004" (UID: "b07a71e7-a076-457d-a1af-42e821516004"). InnerVolumeSpecName "isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:46:03.227769 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.227657 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b07a71e7-a076-457d-a1af-42e821516004-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b07a71e7-a076-457d-a1af-42e821516004" (UID: "b07a71e7-a076-457d-a1af-42e821516004"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:46:03.229177 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.229151 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b07a71e7-a076-457d-a1af-42e821516004-kube-api-access-gntcv" (OuterVolumeSpecName: "kube-api-access-gntcv") pod "b07a71e7-a076-457d-a1af-42e821516004" (UID: "b07a71e7-a076-457d-a1af-42e821516004"). InnerVolumeSpecName "kube-api-access-gntcv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:46:03.229251 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.229217 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b07a71e7-a076-457d-a1af-42e821516004-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b07a71e7-a076-457d-a1af-42e821516004" (UID: "b07a71e7-a076-457d-a1af-42e821516004"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:46:03.328478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.328454 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b07a71e7-a076-457d-a1af-42e821516004-isvc-sklearn-scale-raw-eb854-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:46:03.328478 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.328476 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b07a71e7-a076-457d-a1af-42e821516004-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:46:03.328613 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.328488 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b07a71e7-a076-457d-a1af-42e821516004-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:46:03.328613 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.328498 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gntcv\" (UniqueName: \"kubernetes.io/projected/b07a71e7-a076-457d-a1af-42e821516004-kube-api-access-gntcv\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:46:03.646988 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.646962 2572 generic.go:358] "Generic (PLEG): container finished" podID="b07a71e7-a076-457d-a1af-42e821516004" containerID="7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660" exitCode=0 Apr 24 22:46:03.647322 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.647056 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" 
event={"ID":"b07a71e7-a076-457d-a1af-42e821516004","Type":"ContainerDied","Data":"7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660"}
Apr 24 22:46:03.647322 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.647090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2" event={"ID":"b07a71e7-a076-457d-a1af-42e821516004","Type":"ContainerDied","Data":"edd52ae8b0d8b02fbebfaf95057218a6bd4c13f37cf8e0195b3d56d2f5691a42"}
Apr 24 22:46:03.647322 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.647104 2572 scope.go:117] "RemoveContainer" containerID="e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882"
Apr 24 22:46:03.647322 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.647064 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2"
Apr 24 22:46:03.654917 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.654898 2572 scope.go:117] "RemoveContainer" containerID="7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660"
Apr 24 22:46:03.661737 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.661720 2572 scope.go:117] "RemoveContainer" containerID="230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e"
Apr 24 22:46:03.668247 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.668227 2572 scope.go:117] "RemoveContainer" containerID="e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882"
Apr 24 22:46:03.668556 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:46:03.668528 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882\": container with ID starting with e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882 not found: ID does not exist" containerID="e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882"
Apr 24 22:46:03.668625 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.668566 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882"} err="failed to get container status \"e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882\": rpc error: code = NotFound desc = could not find container \"e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882\": container with ID starting with e2acef23d16d1d90bb8c241f8ab1d94093407baee227b7e76c5d4f137a613882 not found: ID does not exist"
Apr 24 22:46:03.668625 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.668591 2572 scope.go:117] "RemoveContainer" containerID="7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660"
Apr 24 22:46:03.668843 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:46:03.668826 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660\": container with ID starting with 7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660 not found: ID does not exist" containerID="7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660"
Apr 24 22:46:03.668906 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.668850 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660"} err="failed to get container status \"7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660\": rpc error: code = NotFound desc = could not find container \"7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660\": container with ID starting with 7f42dfb3352dab2da2bafe21f205dfcf55d4f7391a2c8f39ef12262a5b65b660 not found: ID does not exist"
Apr 24 22:46:03.668906 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.668871 2572 scope.go:117] "RemoveContainer" containerID="230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e"
Apr 24 22:46:03.669198 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:46:03.669151 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e\": container with ID starting with 230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e not found: ID does not exist" containerID="230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e"
Apr 24 22:46:03.669198 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.669174 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e"} err="failed to get container status \"230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e\": rpc error: code = NotFound desc = could not find container \"230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e\": container with ID starting with 230e9fc871c2452c1bd09e9f69d56b6dd1a04f611ea531a724c3b12c61a3f90e not found: ID does not exist"
Apr 24 22:46:03.671515 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.671492 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2"]
Apr 24 22:46:03.679831 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:03.679807 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-eb854-predictor-78dd5bdfb6-rzfw2"]
Apr 24 22:46:05.417638 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:05.417607 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b07a71e7-a076-457d-a1af-42e821516004" path="/var/lib/kubelet/pods/b07a71e7-a076-457d-a1af-42e821516004/volumes"
Apr 24 22:46:07.646915 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:07.646887 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh"
Apr 24 22:46:07.647501 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:07.647472 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 22:46:17.648065 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:17.647963 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 22:46:27.647479 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:27.647424 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 22:46:37.647988 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:37.647950 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 22:46:47.647766 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:47.647722 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 22:46:57.647878 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:46:57.647840 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 22:47:07.648179 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:07.648143 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh"
Apr 24 22:47:15.262339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262305 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"]
Apr 24 22:47:15.262708 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262647 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container"
Apr 24 22:47:15.262708 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262659 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container"
Apr 24 22:47:15.262708 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262675 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kube-rbac-proxy"
Apr 24 22:47:15.262708 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262682 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kube-rbac-proxy"
Apr 24 22:47:15.262708 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262699 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="storage-initializer"
Apr 24 22:47:15.262708 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262706 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="storage-initializer"
Apr 24 22:47:15.262942 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262757 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kserve-container"
Apr 24 22:47:15.262942 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.262766 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b07a71e7-a076-457d-a1af-42e821516004" containerName="kube-rbac-proxy"
Apr 24 22:47:15.265827 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.265808 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.268724 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.268692 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-51530a-predictor-serving-cert\""
Apr 24 22:47:15.268882 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.268694 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-51530a\""
Apr 24 22:47:15.268882 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.268771 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-51530a-dockercfg-nwj46\""
Apr 24 22:47:15.268882 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.268855 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-51530a-kube-rbac-proxy-sar-config\""
Apr 24 22:47:15.269106 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.269090 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 24 22:47:15.277539 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.277515 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"]
Apr 24 22:47:15.343141 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.343116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-std28\" (UniqueName: \"kubernetes.io/projected/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kube-api-access-std28\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.343261 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.343177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-proxy-tls\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.343261 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.343203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-cabundle-cert\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.343261 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.343232 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-isvc-secondary-51530a-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.343365 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.343265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kserve-provision-location\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.443965 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.443939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-std28\" (UniqueName: \"kubernetes.io/projected/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kube-api-access-std28\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.444103 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.443989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-proxy-tls\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.444329 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.444108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-cabundle-cert\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.444329 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.444134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-isvc-secondary-51530a-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.444329 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.444165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kserve-provision-location\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.444573 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.444548 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kserve-provision-location\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.444844 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.444825 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-isvc-secondary-51530a-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.444893 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.444844 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-cabundle-cert\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.446436 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.446417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-proxy-tls\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.453451 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.453430 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-std28\" (UniqueName: \"kubernetes.io/projected/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kube-api-access-std28\") pod \"isvc-secondary-51530a-predictor-687574b7c4-tjj9j\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") " pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.577876 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.577823 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:15.701157 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.701130 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"]
Apr 24 22:47:15.704083 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:47:15.704056 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cabed56_cc14_4772_b75b_1aca8bb1ca0b.slice/crio-fe53b0b65b2ce2f5a9a5d37ac76dd9af56e6b49163a7856e4ac502cae9836af5 WatchSource:0}: Error finding container fe53b0b65b2ce2f5a9a5d37ac76dd9af56e6b49163a7856e4ac502cae9836af5: Status 404 returned error can't find the container with id fe53b0b65b2ce2f5a9a5d37ac76dd9af56e6b49163a7856e4ac502cae9836af5
Apr 24 22:47:15.705924 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.705906 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:47:15.860219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.860151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j" event={"ID":"5cabed56-cc14-4772-b75b-1aca8bb1ca0b","Type":"ContainerStarted","Data":"9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183"}
Apr 24 22:47:15.860219 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:15.860186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j" event={"ID":"5cabed56-cc14-4772-b75b-1aca8bb1ca0b","Type":"ContainerStarted","Data":"fe53b0b65b2ce2f5a9a5d37ac76dd9af56e6b49163a7856e4ac502cae9836af5"}
Apr 24 22:47:18.871381 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:18.871352 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_5cabed56-cc14-4772-b75b-1aca8bb1ca0b/storage-initializer/0.log"
Apr 24 22:47:18.871738 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:18.871395 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" containerID="9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183" exitCode=1
Apr 24 22:47:18.871738 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:18.871460 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j" event={"ID":"5cabed56-cc14-4772-b75b-1aca8bb1ca0b","Type":"ContainerDied","Data":"9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183"}
Apr 24 22:47:19.876426 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:19.876391 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_5cabed56-cc14-4772-b75b-1aca8bb1ca0b/storage-initializer/0.log"
Apr 24 22:47:19.876816 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:19.876519 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j" event={"ID":"5cabed56-cc14-4772-b75b-1aca8bb1ca0b","Type":"ContainerStarted","Data":"341b3aa9c86b88f57a481b9a6893b03a24f217d77c1216c86d61bead91488b27"}
Apr 24 22:47:22.893069 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:22.893046 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_5cabed56-cc14-4772-b75b-1aca8bb1ca0b/storage-initializer/1.log"
Apr 24 22:47:22.893453 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:22.893435 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_5cabed56-cc14-4772-b75b-1aca8bb1ca0b/storage-initializer/0.log"
Apr 24 22:47:22.893507 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:22.893476 2572 generic.go:358] "Generic (PLEG): container finished" podID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" containerID="341b3aa9c86b88f57a481b9a6893b03a24f217d77c1216c86d61bead91488b27" exitCode=1
Apr 24 22:47:22.893546 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:22.893529 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j" event={"ID":"5cabed56-cc14-4772-b75b-1aca8bb1ca0b","Type":"ContainerDied","Data":"341b3aa9c86b88f57a481b9a6893b03a24f217d77c1216c86d61bead91488b27"}
Apr 24 22:47:22.893579 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:22.893568 2572 scope.go:117] "RemoveContainer" containerID="9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183"
Apr 24 22:47:22.894176 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:22.894158 2572 scope.go:117] "RemoveContainer" containerID="9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183"
Apr 24 22:47:22.903735 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:22.903708 2572 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_kserve-ci-e2e-test_5cabed56-cc14-4772-b75b-1aca8bb1ca0b_0 in pod sandbox fe53b0b65b2ce2f5a9a5d37ac76dd9af56e6b49163a7856e4ac502cae9836af5 from index: no such id: '9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183'" containerID="9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183"
Apr 24 22:47:22.903791 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:22.903745 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_kserve-ci-e2e-test_5cabed56-cc14-4772-b75b-1aca8bb1ca0b_0 in pod sandbox fe53b0b65b2ce2f5a9a5d37ac76dd9af56e6b49163a7856e4ac502cae9836af5 from index: no such id: '9bfb9995d9380a86e0d0e4082ca8543da1ece2a7b3bb3d64f06bbf24d0837183'"
Apr 24 22:47:22.903864 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:22.903847 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-51530a-predictor-687574b7c4-tjj9j_kserve-ci-e2e-test(5cabed56-cc14-4772-b75b-1aca8bb1ca0b)\"" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j" podUID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b"
Apr 24 22:47:23.897407 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:23.897383 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_5cabed56-cc14-4772-b75b-1aca8bb1ca0b/storage-initializer/1.log"
Apr 24 22:47:29.335270 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.335226 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"]
Apr 24 22:47:29.406725 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.406695 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh"]
Apr 24 22:47:29.407248 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.407072 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" containerID="cri-o://1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215" gracePeriod=30
Apr 24 22:47:29.407248 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.407189 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kube-rbac-proxy" containerID="cri-o://28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c" gracePeriod=30
Apr 24 22:47:29.443412 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.443385 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"]
Apr 24 22:47:29.448306 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.448287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.450999 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.450979 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-bd8d89-predictor-serving-cert\""
Apr 24 22:47:29.451133 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.451025 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-bd8d89\""
Apr 24 22:47:29.451133 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.451090 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\""
Apr 24 22:47:29.451133 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.451101 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-bd8d89-dockercfg-hgksr\""
Apr 24 22:47:29.456579 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.456560 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"]
Apr 24 22:47:29.526637 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.526620 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_5cabed56-cc14-4772-b75b-1aca8bb1ca0b/storage-initializer/1.log"
Apr 24 22:47:29.526717 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.526670 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"
Apr 24 22:47:29.547504 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.547485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-cabundle-cert\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.547615 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.547530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vg8\" (UniqueName: \"kubernetes.io/projected/2af1bab9-0468-42de-94e7-e6081a642248-kube-api-access-p6vg8\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.547615 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.547604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2af1bab9-0468-42de-94e7-e6081a642248-kserve-provision-location\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.547731 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.547685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2af1bab9-0468-42de-94e7-e6081a642248-proxy-tls\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.547856 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.547834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.649123 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649059 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-proxy-tls\") pod \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") "
Apr 24 22:47:29.649231 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649125 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-isvc-secondary-51530a-kube-rbac-proxy-sar-config\") pod \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") "
Apr 24 22:47:29.649231 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649182 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-cabundle-cert\") pod \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") "
Apr 24 22:47:29.649231 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649217 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kserve-provision-location\") pod \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") "
Apr 24 22:47:29.649379 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649247 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-std28\" (UniqueName: \"kubernetes.io/projected/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kube-api-access-std28\") pod \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\" (UID: \"5cabed56-cc14-4772-b75b-1aca8bb1ca0b\") "
Apr 24 22:47:29.649435 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-cabundle-cert\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.649489 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vg8\" (UniqueName: \"kubernetes.io/projected/2af1bab9-0468-42de-94e7-e6081a642248-kube-api-access-p6vg8\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.649541 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649479 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5cabed56-cc14-4772-b75b-1aca8bb1ca0b" (UID: "5cabed56-cc14-4772-b75b-1aca8bb1ca0b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:47:29.649541 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649522 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-isvc-secondary-51530a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-51530a-kube-rbac-proxy-sar-config") pod "5cabed56-cc14-4772-b75b-1aca8bb1ca0b" (UID: "5cabed56-cc14-4772-b75b-1aca8bb1ca0b"). InnerVolumeSpecName "isvc-secondary-51530a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:47:29.649643 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2af1bab9-0468-42de-94e7-e6081a642248-kserve-provision-location\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.649643 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649556 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "5cabed56-cc14-4772-b75b-1aca8bb1ca0b" (UID: "5cabed56-cc14-4772-b75b-1aca8bb1ca0b"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:47:29.649643 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2af1bab9-0468-42de-94e7-e6081a642248-proxy-tls\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.649786 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"
Apr 24 22:47:29.649786 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649699 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-isvc-secondary-51530a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:47:29.649786 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649716 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-cabundle-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:47:29.649786 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649733 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kserve-provision-location\") on node
\"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:29.649958 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.649856 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2af1bab9-0468-42de-94e7-e6081a642248-kserve-provision-location\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" Apr 24 22:47:29.650169 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.650143 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-cabundle-cert\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" Apr 24 22:47:29.650362 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.650340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" Apr 24 22:47:29.651722 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.651698 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5cabed56-cc14-4772-b75b-1aca8bb1ca0b" (UID: "5cabed56-cc14-4772-b75b-1aca8bb1ca0b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:47:29.651841 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.651822 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kube-api-access-std28" (OuterVolumeSpecName: "kube-api-access-std28") pod "5cabed56-cc14-4772-b75b-1aca8bb1ca0b" (UID: "5cabed56-cc14-4772-b75b-1aca8bb1ca0b"). InnerVolumeSpecName "kube-api-access-std28". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:47:29.652352 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.652333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2af1bab9-0468-42de-94e7-e6081a642248-proxy-tls\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" Apr 24 22:47:29.658855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.658832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vg8\" (UniqueName: \"kubernetes.io/projected/2af1bab9-0468-42de-94e7-e6081a642248-kube-api-access-p6vg8\") pod \"isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" Apr 24 22:47:29.751154 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.751123 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:29.751154 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.751148 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-std28\" (UniqueName: 
\"kubernetes.io/projected/5cabed56-cc14-4772-b75b-1aca8bb1ca0b-kube-api-access-std28\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:29.761084 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.761062 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" Apr 24 22:47:29.878170 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.878143 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"] Apr 24 22:47:29.879946 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:47:29.879918 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af1bab9_0468_42de_94e7_e6081a642248.slice/crio-6da0f246480c665a5f16298f83fea3d2a7ba1ef72d7af7d66d0fea1d099500ff WatchSource:0}: Error finding container 6da0f246480c665a5f16298f83fea3d2a7ba1ef72d7af7d66d0fea1d099500ff: Status 404 returned error can't find the container with id 6da0f246480c665a5f16298f83fea3d2a7ba1ef72d7af7d66d0fea1d099500ff Apr 24 22:47:29.918949 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.918927 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerID="28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c" exitCode=2 Apr 24 22:47:29.919059 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.918994 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" event={"ID":"ce65cd98-c037-43cf-8e8d-4a5027764ab6","Type":"ContainerDied","Data":"28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c"} Apr 24 22:47:29.920070 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.920050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" 
event={"ID":"2af1bab9-0468-42de-94e7-e6081a642248","Type":"ContainerStarted","Data":"6da0f246480c665a5f16298f83fea3d2a7ba1ef72d7af7d66d0fea1d099500ff"} Apr 24 22:47:29.921083 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.921067 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-51530a-predictor-687574b7c4-tjj9j_5cabed56-cc14-4772-b75b-1aca8bb1ca0b/storage-initializer/1.log" Apr 24 22:47:29.921159 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.921111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j" event={"ID":"5cabed56-cc14-4772-b75b-1aca8bb1ca0b","Type":"ContainerDied","Data":"fe53b0b65b2ce2f5a9a5d37ac76dd9af56e6b49163a7856e4ac502cae9836af5"} Apr 24 22:47:29.921159 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.921131 2572 scope.go:117] "RemoveContainer" containerID="341b3aa9c86b88f57a481b9a6893b03a24f217d77c1216c86d61bead91488b27" Apr 24 22:47:29.921246 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.921175 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j" Apr 24 22:47:29.961700 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.961675 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"] Apr 24 22:47:29.965866 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:29.965847 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-51530a-predictor-687574b7c4-tjj9j"] Apr 24 22:47:30.926196 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:30.926163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" event={"ID":"2af1bab9-0468-42de-94e7-e6081a642248","Type":"ContainerStarted","Data":"93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45"} Apr 24 22:47:31.417899 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:31.417865 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" path="/var/lib/kubelet/pods/5cabed56-cc14-4772-b75b-1aca8bb1ca0b/volumes" Apr 24 22:47:32.643115 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:32.643076 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 24 22:47:33.146627 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.146605 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:47:33.280645 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.280626 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce65cd98-c037-43cf-8e8d-4a5027764ab6-isvc-primary-51530a-kube-rbac-proxy-sar-config\") pod \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " Apr 24 22:47:33.280788 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.280658 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls\") pod \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " Apr 24 22:47:33.280788 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.280682 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5cm\" (UniqueName: \"kubernetes.io/projected/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kube-api-access-pk5cm\") pod \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " Apr 24 22:47:33.280913 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.280821 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kserve-provision-location\") pod \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\" (UID: \"ce65cd98-c037-43cf-8e8d-4a5027764ab6\") " Apr 24 22:47:33.281109 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.281084 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce65cd98-c037-43cf-8e8d-4a5027764ab6-isvc-primary-51530a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-primary-51530a-kube-rbac-proxy-sar-config") pod "ce65cd98-c037-43cf-8e8d-4a5027764ab6" (UID: "ce65cd98-c037-43cf-8e8d-4a5027764ab6"). InnerVolumeSpecName "isvc-primary-51530a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:47:33.281190 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.281139 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ce65cd98-c037-43cf-8e8d-4a5027764ab6" (UID: "ce65cd98-c037-43cf-8e8d-4a5027764ab6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:47:33.282647 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.282625 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kube-api-access-pk5cm" (OuterVolumeSpecName: "kube-api-access-pk5cm") pod "ce65cd98-c037-43cf-8e8d-4a5027764ab6" (UID: "ce65cd98-c037-43cf-8e8d-4a5027764ab6"). InnerVolumeSpecName "kube-api-access-pk5cm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:47:33.282732 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.282705 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ce65cd98-c037-43cf-8e8d-4a5027764ab6" (UID: "ce65cd98-c037-43cf-8e8d-4a5027764ab6"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:47:33.381450 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.381429 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pk5cm\" (UniqueName: \"kubernetes.io/projected/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kube-api-access-pk5cm\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:33.381450 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.381451 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ce65cd98-c037-43cf-8e8d-4a5027764ab6-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:33.381598 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.381463 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-51530a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ce65cd98-c037-43cf-8e8d-4a5027764ab6-isvc-primary-51530a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:33.381598 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.381472 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ce65cd98-c037-43cf-8e8d-4a5027764ab6-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:33.937464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.937431 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerID="1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215" exitCode=0 Apr 24 22:47:33.937861 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.937517 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" 
event={"ID":"ce65cd98-c037-43cf-8e8d-4a5027764ab6","Type":"ContainerDied","Data":"1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215"} Apr 24 22:47:33.937861 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.937540 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" Apr 24 22:47:33.937861 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.937586 2572 scope.go:117] "RemoveContainer" containerID="28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c" Apr 24 22:47:33.937861 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.937571 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh" event={"ID":"ce65cd98-c037-43cf-8e8d-4a5027764ab6","Type":"ContainerDied","Data":"8afec3f7e58f496171ed0b4b272b534f888e9d758526deb93e47c23573323d8b"} Apr 24 22:47:33.939416 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.939395 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml_2af1bab9-0468-42de-94e7-e6081a642248/storage-initializer/0.log" Apr 24 22:47:33.939538 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.939434 2572 generic.go:358] "Generic (PLEG): container finished" podID="2af1bab9-0468-42de-94e7-e6081a642248" containerID="93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45" exitCode=1 Apr 24 22:47:33.939538 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.939520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" event={"ID":"2af1bab9-0468-42de-94e7-e6081a642248","Type":"ContainerDied","Data":"93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45"} Apr 24 22:47:33.955885 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.955827 2572 scope.go:117] "RemoveContainer" 
containerID="1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215" Apr 24 22:47:33.967512 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.967479 2572 scope.go:117] "RemoveContainer" containerID="e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556" Apr 24 22:47:33.979209 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.979181 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh"] Apr 24 22:47:33.982890 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.982809 2572 scope.go:117] "RemoveContainer" containerID="28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c" Apr 24 22:47:33.983145 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:33.983115 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c\": container with ID starting with 28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c not found: ID does not exist" containerID="28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c" Apr 24 22:47:33.983259 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.983153 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c"} err="failed to get container status \"28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c\": rpc error: code = NotFound desc = could not find container \"28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c\": container with ID starting with 28e4ca490b3d57407872554ec413b9a99c35cfad7c6f2bbccc01ba107a5c610c not found: ID does not exist" Apr 24 22:47:33.983259 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.983170 2572 scope.go:117] "RemoveContainer" containerID="1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215" Apr 24 
22:47:33.983514 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:33.983481 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215\": container with ID starting with 1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215 not found: ID does not exist" containerID="1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215" Apr 24 22:47:33.983514 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.983507 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215"} err="failed to get container status \"1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215\": rpc error: code = NotFound desc = could not find container \"1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215\": container with ID starting with 1f12ed1fd79574516745a45806e5a418d4278ac0603d3480dd1eabe1246e5215 not found: ID does not exist" Apr 24 22:47:33.983645 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.983524 2572 scope.go:117] "RemoveContainer" containerID="e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556" Apr 24 22:47:33.983645 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.983611 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-51530a-predictor-77f766b669-jfmsh"] Apr 24 22:47:33.983781 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:33.983762 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556\": container with ID starting with e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556 not found: ID does not exist" containerID="e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556" Apr 24 
22:47:33.983841 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:33.983792 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556"} err="failed to get container status \"e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556\": rpc error: code = NotFound desc = could not find container \"e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556\": container with ID starting with e2e403400a815fc1a1b889da60dda6fd0f23bbda5e661a98aad41767c4a39556 not found: ID does not exist" Apr 24 22:47:34.445002 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.444973 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"] Apr 24 22:47:34.586549 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586519 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l"] Apr 24 22:47:34.586850 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586839 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" containerName="storage-initializer" Apr 24 22:47:34.586891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586852 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" containerName="storage-initializer" Apr 24 22:47:34.586891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586863 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" Apr 24 22:47:34.586891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586869 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" Apr 24 22:47:34.586891 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:47:34.586874 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kube-rbac-proxy" Apr 24 22:47:34.586891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586880 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kube-rbac-proxy" Apr 24 22:47:34.586891 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586891 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" containerName="storage-initializer" Apr 24 22:47:34.587108 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586896 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" containerName="storage-initializer" Apr 24 22:47:34.587108 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586912 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="storage-initializer" Apr 24 22:47:34.587108 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586917 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="storage-initializer" Apr 24 22:47:34.587108 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586962 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" containerName="storage-initializer" Apr 24 22:47:34.587108 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586972 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5cabed56-cc14-4772-b75b-1aca8bb1ca0b" containerName="storage-initializer" Apr 24 22:47:34.587108 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586979 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kube-rbac-proxy" Apr 24 22:47:34.587108 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.586986 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" containerName="kserve-container" Apr 24 22:47:34.590232 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.590216 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.593276 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.593249 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-b6db3-predictor-serving-cert\"" Apr 24 22:47:34.593404 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.593374 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gtqzv\"" Apr 24 22:47:34.593559 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.593546 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-b6db3-kube-rbac-proxy-sar-config\"" Apr 24 22:47:34.601106 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.601086 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l"] Apr 24 22:47:34.692673 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.692646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.692788 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.692677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-b6db3-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/595d3465-735f-451b-8411-9913eb253a29-raw-sklearn-b6db3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.692788 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.692723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/595d3465-735f-451b-8411-9913eb253a29-kserve-provision-location\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.692788 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.692763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5v4q\" (UniqueName: \"kubernetes.io/projected/595d3465-735f-451b-8411-9913eb253a29-kube-api-access-m5v4q\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.793734 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.793703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5v4q\" (UniqueName: \"kubernetes.io/projected/595d3465-735f-451b-8411-9913eb253a29-kube-api-access-m5v4q\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.793844 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.793766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls\") pod 
\"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.793844 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.793793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-b6db3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/595d3465-735f-451b-8411-9913eb253a29-raw-sklearn-b6db3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.793844 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.793828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/595d3465-735f-451b-8411-9913eb253a29-kserve-provision-location\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.794090 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:34.793936 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-serving-cert: secret "raw-sklearn-b6db3-predictor-serving-cert" not found Apr 24 22:47:34.794090 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:34.794001 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls podName:595d3465-735f-451b-8411-9913eb253a29 nodeName:}" failed. No retries permitted until 2026-04-24 22:47:35.293980606 +0000 UTC m=+1062.442286132 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls") pod "raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" (UID: "595d3465-735f-451b-8411-9913eb253a29") : secret "raw-sklearn-b6db3-predictor-serving-cert" not found Apr 24 22:47:34.794257 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.794235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/595d3465-735f-451b-8411-9913eb253a29-kserve-provision-location\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.794497 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.794480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-b6db3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/595d3465-735f-451b-8411-9913eb253a29-raw-sklearn-b6db3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.804866 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.804841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5v4q\" (UniqueName: \"kubernetes.io/projected/595d3465-735f-451b-8411-9913eb253a29-kube-api-access-m5v4q\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:34.943576 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.943559 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml_2af1bab9-0468-42de-94e7-e6081a642248/storage-initializer/0.log" Apr 
24 22:47:34.943896 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.943675 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" event={"ID":"2af1bab9-0468-42de-94e7-e6081a642248","Type":"ContainerStarted","Data":"e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70"} Apr 24 22:47:34.943896 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:34.943776 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" podUID="2af1bab9-0468-42de-94e7-e6081a642248" containerName="storage-initializer" containerID="cri-o://e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70" gracePeriod=30 Apr 24 22:47:35.297456 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:35.297430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:35.299529 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:35.299512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls\") pod \"raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:35.417435 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:35.417410 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce65cd98-c037-43cf-8e8d-4a5027764ab6" path="/var/lib/kubelet/pods/ce65cd98-c037-43cf-8e8d-4a5027764ab6/volumes" Apr 24 22:47:35.500906 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:35.500875 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:35.615503 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:35.615481 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l"] Apr 24 22:47:35.617265 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:47:35.617233 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod595d3465_735f_451b_8411_9913eb253a29.slice/crio-ed126c45c0051dc82b82588cf7ee7c9cfd6ca170e36643caae7e11cdf4f437dd WatchSource:0}: Error finding container ed126c45c0051dc82b82588cf7ee7c9cfd6ca170e36643caae7e11cdf4f437dd: Status 404 returned error can't find the container with id ed126c45c0051dc82b82588cf7ee7c9cfd6ca170e36643caae7e11cdf4f437dd Apr 24 22:47:35.949429 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:35.949358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" event={"ID":"595d3465-735f-451b-8411-9913eb253a29","Type":"ContainerStarted","Data":"696ff6123011db41698133c90d57d4b0b8e711380b7efb1282e280e2ef5b3d04"} Apr 24 22:47:35.949429 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:35.949392 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" event={"ID":"595d3465-735f-451b-8411-9913eb253a29","Type":"ContainerStarted","Data":"ed126c45c0051dc82b82588cf7ee7c9cfd6ca170e36643caae7e11cdf4f437dd"} Apr 24 22:47:38.586763 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.586743 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml_2af1bab9-0468-42de-94e7-e6081a642248/storage-initializer/1.log" Apr 24 22:47:38.587152 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.587135 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml_2af1bab9-0468-42de-94e7-e6081a642248/storage-initializer/0.log" Apr 24 22:47:38.587228 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.587219 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" Apr 24 22:47:38.626839 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.626816 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6vg8\" (UniqueName: \"kubernetes.io/projected/2af1bab9-0468-42de-94e7-e6081a642248-kube-api-access-p6vg8\") pod \"2af1bab9-0468-42de-94e7-e6081a642248\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " Apr 24 22:47:38.626937 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.626862 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\") pod \"2af1bab9-0468-42de-94e7-e6081a642248\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " Apr 24 22:47:38.626937 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.626906 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-cabundle-cert\") pod \"2af1bab9-0468-42de-94e7-e6081a642248\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " Apr 24 22:47:38.627036 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.626964 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2af1bab9-0468-42de-94e7-e6081a642248-kserve-provision-location\") pod \"2af1bab9-0468-42de-94e7-e6081a642248\" (UID: 
\"2af1bab9-0468-42de-94e7-e6081a642248\") " Apr 24 22:47:38.627036 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.626989 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2af1bab9-0468-42de-94e7-e6081a642248-proxy-tls\") pod \"2af1bab9-0468-42de-94e7-e6081a642248\" (UID: \"2af1bab9-0468-42de-94e7-e6081a642248\") " Apr 24 22:47:38.627253 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.627226 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af1bab9-0468-42de-94e7-e6081a642248-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2af1bab9-0468-42de-94e7-e6081a642248" (UID: "2af1bab9-0468-42de-94e7-e6081a642248"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:47:38.627335 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.627294 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "2af1bab9-0468-42de-94e7-e6081a642248" (UID: "2af1bab9-0468-42de-94e7-e6081a642248"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:47:38.627335 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.627302 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config") pod "2af1bab9-0468-42de-94e7-e6081a642248" (UID: "2af1bab9-0468-42de-94e7-e6081a642248"). InnerVolumeSpecName "isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:47:38.628793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.628774 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af1bab9-0468-42de-94e7-e6081a642248-kube-api-access-p6vg8" (OuterVolumeSpecName: "kube-api-access-p6vg8") pod "2af1bab9-0468-42de-94e7-e6081a642248" (UID: "2af1bab9-0468-42de-94e7-e6081a642248"). InnerVolumeSpecName "kube-api-access-p6vg8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:47:38.628858 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.628835 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af1bab9-0468-42de-94e7-e6081a642248-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2af1bab9-0468-42de-94e7-e6081a642248" (UID: "2af1bab9-0468-42de-94e7-e6081a642248"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:47:38.728214 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.728156 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-cabundle-cert\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:38.728214 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.728179 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2af1bab9-0468-42de-94e7-e6081a642248-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:38.728214 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.728190 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2af1bab9-0468-42de-94e7-e6081a642248-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:38.728214 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:47:38.728199 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6vg8\" (UniqueName: \"kubernetes.io/projected/2af1bab9-0468-42de-94e7-e6081a642248-kube-api-access-p6vg8\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:38.728214 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.728209 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2af1bab9-0468-42de-94e7-e6081a642248-isvc-init-fail-bd8d89-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:47:38.958788 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.958767 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml_2af1bab9-0468-42de-94e7-e6081a642248/storage-initializer/1.log" Apr 24 22:47:38.959180 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.959159 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml_2af1bab9-0468-42de-94e7-e6081a642248/storage-initializer/0.log" Apr 24 22:47:38.959282 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.959194 2572 generic.go:358] "Generic (PLEG): container finished" podID="2af1bab9-0468-42de-94e7-e6081a642248" containerID="e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70" exitCode=1 Apr 24 22:47:38.959282 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.959229 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" event={"ID":"2af1bab9-0468-42de-94e7-e6081a642248","Type":"ContainerDied","Data":"e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70"} Apr 24 22:47:38.959282 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.959252 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" event={"ID":"2af1bab9-0468-42de-94e7-e6081a642248","Type":"ContainerDied","Data":"6da0f246480c665a5f16298f83fea3d2a7ba1ef72d7af7d66d0fea1d099500ff"} Apr 24 22:47:38.959282 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.959266 2572 scope.go:117] "RemoveContainer" containerID="e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70" Apr 24 22:47:38.959282 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.959277 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml" Apr 24 22:47:38.967626 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.967604 2572 scope.go:117] "RemoveContainer" containerID="93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45" Apr 24 22:47:38.974693 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.974675 2572 scope.go:117] "RemoveContainer" containerID="e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70" Apr 24 22:47:38.974914 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:38.974898 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70\": container with ID starting with e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70 not found: ID does not exist" containerID="e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70" Apr 24 22:47:38.974956 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.974922 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70"} err="failed to get container status \"e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70\": rpc error: code = NotFound desc = could not find container 
\"e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70\": container with ID starting with e7b142cd1237501a3c18d319330ee21ac08e60bc10af296d098698ce11c94c70 not found: ID does not exist" Apr 24 22:47:38.974956 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.974938 2572 scope.go:117] "RemoveContainer" containerID="93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45" Apr 24 22:47:38.975155 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:47:38.975139 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45\": container with ID starting with 93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45 not found: ID does not exist" containerID="93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45" Apr 24 22:47:38.975194 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:38.975161 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45"} err="failed to get container status \"93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45\": rpc error: code = NotFound desc = could not find container \"93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45\": container with ID starting with 93b9b7c54d19b11b1e6fba226be7d21492705fc80496342ab4d729f729acbe45 not found: ID does not exist" Apr 24 22:47:39.001328 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:39.001280 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"] Apr 24 22:47:39.005596 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:39.005575 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-bd8d89-predictor-77996d68c6-lmmml"] Apr 24 22:47:39.417555 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:39.417488 
2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af1bab9-0468-42de-94e7-e6081a642248" path="/var/lib/kubelet/pods/2af1bab9-0468-42de-94e7-e6081a642248/volumes" Apr 24 22:47:39.964936 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:39.964902 2572 generic.go:358] "Generic (PLEG): container finished" podID="595d3465-735f-451b-8411-9913eb253a29" containerID="696ff6123011db41698133c90d57d4b0b8e711380b7efb1282e280e2ef5b3d04" exitCode=0 Apr 24 22:47:39.965423 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:39.964970 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" event={"ID":"595d3465-735f-451b-8411-9913eb253a29","Type":"ContainerDied","Data":"696ff6123011db41698133c90d57d4b0b8e711380b7efb1282e280e2ef5b3d04"} Apr 24 22:47:40.970502 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:40.970465 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" event={"ID":"595d3465-735f-451b-8411-9913eb253a29","Type":"ContainerStarted","Data":"52a47b2ee82d6ffa86249379f52af907c5a476b80a23fb0ce05c94912f7178ba"} Apr 24 22:47:40.970502 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:40.970497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" event={"ID":"595d3465-735f-451b-8411-9913eb253a29","Type":"ContainerStarted","Data":"7f53e002dc95c4be37ae50d3134ec6afd59ba1d1209f6f2f9cf8b9a709d1e2ab"} Apr 24 22:47:40.970899 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:40.970784 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:40.995416 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:40.995368 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" 
podStartSLOduration=6.995353723 podStartE2EDuration="6.995353723s" podCreationTimestamp="2026-04-24 22:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:47:40.994141339 +0000 UTC m=+1068.142446887" watchObservedRunningTime="2026-04-24 22:47:40.995353723 +0000 UTC m=+1068.143659273" Apr 24 22:47:41.974258 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:41.974226 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:41.975519 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:41.975487 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:47:42.977091 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:42.977046 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:47:47.981859 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:47.981826 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:47:47.982459 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:47.982426 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:47:57.982406 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:47:57.982367 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:48:07.982309 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:07.982271 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:48:17.983141 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:17.983104 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:48:27.982609 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:27.982573 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:48:37.982402 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:37.982363 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:48:47.983221 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:47.983183 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:48:54.646252 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.646217 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l"] Apr 24 22:48:54.646834 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.646622 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" containerID="cri-o://7f53e002dc95c4be37ae50d3134ec6afd59ba1d1209f6f2f9cf8b9a709d1e2ab" gracePeriod=30 Apr 24 22:48:54.646834 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.646649 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kube-rbac-proxy" containerID="cri-o://52a47b2ee82d6ffa86249379f52af907c5a476b80a23fb0ce05c94912f7178ba" gracePeriod=30 Apr 24 22:48:54.735660 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.735633 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z"] Apr 24 22:48:54.735980 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.735967 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2af1bab9-0468-42de-94e7-e6081a642248" containerName="storage-initializer" Apr 24 22:48:54.736061 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.735981 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af1bab9-0468-42de-94e7-e6081a642248" containerName="storage-initializer" Apr 24 22:48:54.736061 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.735993 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2af1bab9-0468-42de-94e7-e6081a642248" 
containerName="storage-initializer" Apr 24 22:48:54.736061 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.735998 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af1bab9-0468-42de-94e7-e6081a642248" containerName="storage-initializer" Apr 24 22:48:54.736173 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.736064 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2af1bab9-0468-42de-94e7-e6081a642248" containerName="storage-initializer" Apr 24 22:48:54.736173 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.736072 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2af1bab9-0468-42de-94e7-e6081a642248" containerName="storage-initializer" Apr 24 22:48:54.739378 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.739361 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.741903 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.741879 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-4d001-predictor-serving-cert\"" Apr 24 22:48:54.742181 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.742156 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\"" Apr 24 22:48:54.750764 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.750739 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z"] Apr 24 22:48:54.859945 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.859919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.860069 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.859959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.860069 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.859986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kserve-provision-location\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.860160 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.860072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5dlf\" (UniqueName: \"kubernetes.io/projected/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kube-api-access-q5dlf\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.960842 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.960780 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.960842 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.960812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kserve-provision-location\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.960842 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.960838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5dlf\" (UniqueName: \"kubernetes.io/projected/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kube-api-access-q5dlf\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.961036 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.960886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.961036 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:48:54.960938 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-serving-cert: secret 
"raw-sklearn-runtime-4d001-predictor-serving-cert" not found Apr 24 22:48:54.961036 ip-10-0-137-103 kubenswrapper[2572]: E0424 22:48:54.961004 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls podName:5af0f174-66f8-4ba9-a0fb-b05fed19fccf nodeName:}" failed. No retries permitted until 2026-04-24 22:48:55.460984249 +0000 UTC m=+1142.609289790 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls") pod "raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" (UID: "5af0f174-66f8-4ba9-a0fb-b05fed19fccf") : secret "raw-sklearn-runtime-4d001-predictor-serving-cert" not found Apr 24 22:48:54.961238 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.961217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kserve-provision-location\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.961518 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.961501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:54.969952 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:54.969931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5dlf\" (UniqueName: 
\"kubernetes.io/projected/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kube-api-access-q5dlf\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:55.188076 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:55.188050 2572 generic.go:358] "Generic (PLEG): container finished" podID="595d3465-735f-451b-8411-9913eb253a29" containerID="52a47b2ee82d6ffa86249379f52af907c5a476b80a23fb0ce05c94912f7178ba" exitCode=2 Apr 24 22:48:55.188174 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:55.188122 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" event={"ID":"595d3465-735f-451b-8411-9913eb253a29","Type":"ContainerDied","Data":"52a47b2ee82d6ffa86249379f52af907c5a476b80a23fb0ce05c94912f7178ba"} Apr 24 22:48:55.465263 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:55.465224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:55.467456 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:55.467434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls\") pod \"raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:55.653745 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:55.653713 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:48:55.767396 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:55.767309 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z"] Apr 24 22:48:55.769721 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:48:55.769695 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5af0f174_66f8_4ba9_a0fb_b05fed19fccf.slice/crio-a4ebde482ae0ad29cc01e118de430bcce378320db9a0ab15e01a705b7763a04c WatchSource:0}: Error finding container a4ebde482ae0ad29cc01e118de430bcce378320db9a0ab15e01a705b7763a04c: Status 404 returned error can't find the container with id a4ebde482ae0ad29cc01e118de430bcce378320db9a0ab15e01a705b7763a04c Apr 24 22:48:56.193108 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:56.193072 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" event={"ID":"5af0f174-66f8-4ba9-a0fb-b05fed19fccf","Type":"ContainerStarted","Data":"cbd46ea13eefdd474d2f99d541bd6520795d05cb016195ec6a6b5218a3a1eca6"} Apr 24 22:48:56.193276 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:56.193115 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" event={"ID":"5af0f174-66f8-4ba9-a0fb-b05fed19fccf","Type":"ContainerStarted","Data":"a4ebde482ae0ad29cc01e118de430bcce378320db9a0ab15e01a705b7763a04c"} Apr 24 22:48:57.978076 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:57.978034 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: 
connect: connection refused" Apr 24 22:48:57.982358 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:57.982330 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 22:48:58.201301 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.201273 2572 generic.go:358] "Generic (PLEG): container finished" podID="595d3465-735f-451b-8411-9913eb253a29" containerID="7f53e002dc95c4be37ae50d3134ec6afd59ba1d1209f6f2f9cf8b9a709d1e2ab" exitCode=0 Apr 24 22:48:58.201431 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.201310 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" event={"ID":"595d3465-735f-451b-8411-9913eb253a29","Type":"ContainerDied","Data":"7f53e002dc95c4be37ae50d3134ec6afd59ba1d1209f6f2f9cf8b9a709d1e2ab"} Apr 24 22:48:58.286683 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.286659 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:48:58.390130 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.390106 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls\") pod \"595d3465-735f-451b-8411-9913eb253a29\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " Apr 24 22:48:58.390250 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.390145 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/595d3465-735f-451b-8411-9913eb253a29-kserve-provision-location\") pod \"595d3465-735f-451b-8411-9913eb253a29\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " Apr 24 22:48:58.390250 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.390189 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5v4q\" (UniqueName: \"kubernetes.io/projected/595d3465-735f-451b-8411-9913eb253a29-kube-api-access-m5v4q\") pod \"595d3465-735f-451b-8411-9913eb253a29\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " Apr 24 22:48:58.390250 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.390237 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-b6db3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/595d3465-735f-451b-8411-9913eb253a29-raw-sklearn-b6db3-kube-rbac-proxy-sar-config\") pod \"595d3465-735f-451b-8411-9913eb253a29\" (UID: \"595d3465-735f-451b-8411-9913eb253a29\") " Apr 24 22:48:58.390561 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.390517 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/595d3465-735f-451b-8411-9913eb253a29-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"595d3465-735f-451b-8411-9913eb253a29" (UID: "595d3465-735f-451b-8411-9913eb253a29"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:48:58.390676 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.390603 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/595d3465-735f-451b-8411-9913eb253a29-raw-sklearn-b6db3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-b6db3-kube-rbac-proxy-sar-config") pod "595d3465-735f-451b-8411-9913eb253a29" (UID: "595d3465-735f-451b-8411-9913eb253a29"). InnerVolumeSpecName "raw-sklearn-b6db3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:48:58.392227 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.392205 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "595d3465-735f-451b-8411-9913eb253a29" (UID: "595d3465-735f-451b-8411-9913eb253a29"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:48:58.392315 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.392288 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595d3465-735f-451b-8411-9913eb253a29-kube-api-access-m5v4q" (OuterVolumeSpecName: "kube-api-access-m5v4q") pod "595d3465-735f-451b-8411-9913eb253a29" (UID: "595d3465-735f-451b-8411-9913eb253a29"). InnerVolumeSpecName "kube-api-access-m5v4q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:48:58.490901 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.490866 2572 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-b6db3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/595d3465-735f-451b-8411-9913eb253a29-raw-sklearn-b6db3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:48:58.490901 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.490899 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/595d3465-735f-451b-8411-9913eb253a29-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:48:58.491047 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.490909 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/595d3465-735f-451b-8411-9913eb253a29-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:48:58.491047 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:58.490921 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5v4q\" (UniqueName: \"kubernetes.io/projected/595d3465-735f-451b-8411-9913eb253a29-kube-api-access-m5v4q\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\"" Apr 24 22:48:59.206425 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:59.206390 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" event={"ID":"595d3465-735f-451b-8411-9913eb253a29","Type":"ContainerDied","Data":"ed126c45c0051dc82b82588cf7ee7c9cfd6ca170e36643caae7e11cdf4f437dd"} Apr 24 22:48:59.206425 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:59.206413 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l" Apr 24 22:48:59.206876 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:59.206449 2572 scope.go:117] "RemoveContainer" containerID="52a47b2ee82d6ffa86249379f52af907c5a476b80a23fb0ce05c94912f7178ba" Apr 24 22:48:59.215864 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:59.215846 2572 scope.go:117] "RemoveContainer" containerID="7f53e002dc95c4be37ae50d3134ec6afd59ba1d1209f6f2f9cf8b9a709d1e2ab" Apr 24 22:48:59.223634 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:59.223611 2572 scope.go:117] "RemoveContainer" containerID="696ff6123011db41698133c90d57d4b0b8e711380b7efb1282e280e2ef5b3d04" Apr 24 22:48:59.228909 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:59.228886 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l"] Apr 24 22:48:59.232029 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:59.231989 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-b6db3-predictor-6544bf9b66-j6v7l"] Apr 24 22:48:59.418913 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:48:59.418867 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="595d3465-735f-451b-8411-9913eb253a29" path="/var/lib/kubelet/pods/595d3465-735f-451b-8411-9913eb253a29/volumes" Apr 24 22:49:00.211614 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:00.211578 2572 generic.go:358] "Generic (PLEG): container finished" podID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerID="cbd46ea13eefdd474d2f99d541bd6520795d05cb016195ec6a6b5218a3a1eca6" exitCode=0 Apr 24 22:49:00.211969 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:00.211618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" 
event={"ID":"5af0f174-66f8-4ba9-a0fb-b05fed19fccf","Type":"ContainerDied","Data":"cbd46ea13eefdd474d2f99d541bd6520795d05cb016195ec6a6b5218a3a1eca6"} Apr 24 22:49:01.216684 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:01.216650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" event={"ID":"5af0f174-66f8-4ba9-a0fb-b05fed19fccf","Type":"ContainerStarted","Data":"1fd30eeb38e19b8de4b7b6df31f114142b210457ece06b7e29cdf6278ee8e02a"} Apr 24 22:49:01.216684 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:01.216691 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" event={"ID":"5af0f174-66f8-4ba9-a0fb-b05fed19fccf","Type":"ContainerStarted","Data":"7019c51baf36bfe369a36ac752098ea3aa04f1c118b3658b6bede56d13972bbd"} Apr 24 22:49:01.217092 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:01.216892 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:49:01.235353 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:01.235310 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podStartSLOduration=7.235297746 podStartE2EDuration="7.235297746s" podCreationTimestamp="2026-04-24 22:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:49:01.23341653 +0000 UTC m=+1148.381722080" watchObservedRunningTime="2026-04-24 22:49:01.235297746 +0000 UTC m=+1148.383603294" Apr 24 22:49:02.219309 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:02.219275 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 
22:49:02.220601 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:02.220570 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:49:03.221850 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:03.221812 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:49:08.226381 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:08.226307 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:49:08.226958 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:08.226929 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:49:18.227458 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:18.227419 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:49:28.227136 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:28.227100 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" 
podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:49:38.227128 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:38.227088 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:49:48.226817 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:48.226779 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:49:58.226910 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:49:58.226867 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:50:08.228207 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:08.228169 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:50:14.810983 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:14.810950 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z"] Apr 24 22:50:14.811464 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:14.811365 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" 
podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" containerID="cri-o://7019c51baf36bfe369a36ac752098ea3aa04f1c118b3658b6bede56d13972bbd" gracePeriod=30 Apr 24 22:50:14.811536 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:14.811434 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kube-rbac-proxy" containerID="cri-o://1fd30eeb38e19b8de4b7b6df31f114142b210457ece06b7e29cdf6278ee8e02a" gracePeriod=30 Apr 24 22:50:15.448621 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:15.448588 2572 generic.go:358] "Generic (PLEG): container finished" podID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerID="1fd30eeb38e19b8de4b7b6df31f114142b210457ece06b7e29cdf6278ee8e02a" exitCode=2 Apr 24 22:50:15.448765 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:15.448624 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" event={"ID":"5af0f174-66f8-4ba9-a0fb-b05fed19fccf","Type":"ContainerDied","Data":"1fd30eeb38e19b8de4b7b6df31f114142b210457ece06b7e29cdf6278ee8e02a"} Apr 24 22:50:16.180165 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180128 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hjfmx/must-gather-4zbvj"] Apr 24 22:50:16.180508 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180448 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="storage-initializer" Apr 24 22:50:16.180508 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180457 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="storage-initializer" Apr 24 22:50:16.180508 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180469 2572 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" Apr 24 22:50:16.180508 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180474 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" Apr 24 22:50:16.180508 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180483 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kube-rbac-proxy" Apr 24 22:50:16.180508 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180489 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kube-rbac-proxy" Apr 24 22:50:16.180697 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180554 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kserve-container" Apr 24 22:50:16.180697 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.180566 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="595d3465-735f-451b-8411-9913eb253a29" containerName="kube-rbac-proxy" Apr 24 22:50:16.183581 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.183566 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" Apr 24 22:50:16.186169 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.186146 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjfmx\"/\"openshift-service-ca.crt\"" Apr 24 22:50:16.187399 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.187376 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hjfmx\"/\"default-dockercfg-gpqpw\"" Apr 24 22:50:16.187578 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.187443 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjfmx\"/\"kube-root-ca.crt\"" Apr 24 22:50:16.189997 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.189977 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjfmx/must-gather-4zbvj"] Apr 24 22:50:16.238572 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.238549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq4j8\" (UniqueName: \"kubernetes.io/projected/3b896dfa-f45b-49b2-bc7a-c2446f500c54-kube-api-access-qq4j8\") pod \"must-gather-4zbvj\" (UID: \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\") " pod="openshift-must-gather-hjfmx/must-gather-4zbvj" Apr 24 22:50:16.238674 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.238598 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b896dfa-f45b-49b2-bc7a-c2446f500c54-must-gather-output\") pod \"must-gather-4zbvj\" (UID: \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\") " pod="openshift-must-gather-hjfmx/must-gather-4zbvj" Apr 24 22:50:16.339855 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.339833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq4j8\" (UniqueName: 
\"kubernetes.io/projected/3b896dfa-f45b-49b2-bc7a-c2446f500c54-kube-api-access-qq4j8\") pod \"must-gather-4zbvj\" (UID: \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\") " pod="openshift-must-gather-hjfmx/must-gather-4zbvj" Apr 24 22:50:16.339959 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.339877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b896dfa-f45b-49b2-bc7a-c2446f500c54-must-gather-output\") pod \"must-gather-4zbvj\" (UID: \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\") " pod="openshift-must-gather-hjfmx/must-gather-4zbvj" Apr 24 22:50:16.340157 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.340142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b896dfa-f45b-49b2-bc7a-c2446f500c54-must-gather-output\") pod \"must-gather-4zbvj\" (UID: \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\") " pod="openshift-must-gather-hjfmx/must-gather-4zbvj" Apr 24 22:50:16.349151 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.349131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq4j8\" (UniqueName: \"kubernetes.io/projected/3b896dfa-f45b-49b2-bc7a-c2446f500c54-kube-api-access-qq4j8\") pod \"must-gather-4zbvj\" (UID: \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\") " pod="openshift-must-gather-hjfmx/must-gather-4zbvj" Apr 24 22:50:16.502754 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.502728 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" Apr 24 22:50:16.620769 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:16.620740 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjfmx/must-gather-4zbvj"] Apr 24 22:50:16.623762 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:50:16.623736 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b896dfa_f45b_49b2_bc7a_c2446f500c54.slice/crio-b12d24be63255818ef95b3bca7fbfae79700a9a82ee074bdbde3adb3668dc255 WatchSource:0}: Error finding container b12d24be63255818ef95b3bca7fbfae79700a9a82ee074bdbde3adb3668dc255: Status 404 returned error can't find the container with id b12d24be63255818ef95b3bca7fbfae79700a9a82ee074bdbde3adb3668dc255 Apr 24 22:50:17.462314 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:17.462241 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" event={"ID":"3b896dfa-f45b-49b2-bc7a-c2446f500c54","Type":"ContainerStarted","Data":"b12d24be63255818ef95b3bca7fbfae79700a9a82ee074bdbde3adb3668dc255"} Apr 24 22:50:18.222783 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:18.222734 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 24 22:50:18.227156 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:18.227123 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 22:50:19.472217 
ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:19.472177 2572 generic.go:358] "Generic (PLEG): container finished" podID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerID="7019c51baf36bfe369a36ac752098ea3aa04f1c118b3658b6bede56d13972bbd" exitCode=0 Apr 24 22:50:19.472651 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:19.472229 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" event={"ID":"5af0f174-66f8-4ba9-a0fb-b05fed19fccf","Type":"ContainerDied","Data":"7019c51baf36bfe369a36ac752098ea3aa04f1c118b3658b6bede56d13972bbd"} Apr 24 22:50:20.512620 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.512597 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" Apr 24 22:50:20.575458 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.575433 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\") pod \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " Apr 24 22:50:20.575585 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.575524 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kserve-provision-location\") pod \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " Apr 24 22:50:20.575585 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.575556 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls\") pod 
\"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " Apr 24 22:50:20.575585 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.575574 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5dlf\" (UniqueName: \"kubernetes.io/projected/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kube-api-access-q5dlf\") pod \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\" (UID: \"5af0f174-66f8-4ba9-a0fb-b05fed19fccf\") " Apr 24 22:50:20.575830 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.575799 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config") pod "5af0f174-66f8-4ba9-a0fb-b05fed19fccf" (UID: "5af0f174-66f8-4ba9-a0fb-b05fed19fccf"). InnerVolumeSpecName "raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:50:20.575926 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.575866 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5af0f174-66f8-4ba9-a0fb-b05fed19fccf" (UID: "5af0f174-66f8-4ba9-a0fb-b05fed19fccf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:50:20.577410 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.577388 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5af0f174-66f8-4ba9-a0fb-b05fed19fccf" (UID: "5af0f174-66f8-4ba9-a0fb-b05fed19fccf"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:50:20.577973 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.577952 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kube-api-access-q5dlf" (OuterVolumeSpecName: "kube-api-access-q5dlf") pod "5af0f174-66f8-4ba9-a0fb-b05fed19fccf" (UID: "5af0f174-66f8-4ba9-a0fb-b05fed19fccf"). InnerVolumeSpecName "kube-api-access-q5dlf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:50:20.676793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.676709 2572 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-raw-sklearn-runtime-4d001-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:50:20.676793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.676742 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kserve-provision-location\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:50:20.676793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.676758 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-proxy-tls\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:50:20.676793 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:20.676773 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q5dlf\" (UniqueName: \"kubernetes.io/projected/5af0f174-66f8-4ba9-a0fb-b05fed19fccf-kube-api-access-q5dlf\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:50:21.484111 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.484074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z" event={"ID":"5af0f174-66f8-4ba9-a0fb-b05fed19fccf","Type":"ContainerDied","Data":"a4ebde482ae0ad29cc01e118de430bcce378320db9a0ab15e01a705b7763a04c"}
Apr 24 22:50:21.484111 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.484113 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z"
Apr 24 22:50:21.484363 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.484131 2572 scope.go:117] "RemoveContainer" containerID="1fd30eeb38e19b8de4b7b6df31f114142b210457ece06b7e29cdf6278ee8e02a"
Apr 24 22:50:21.485951 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.485914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" event={"ID":"3b896dfa-f45b-49b2-bc7a-c2446f500c54","Type":"ContainerStarted","Data":"47e9b72209ec56d834521b2299821aec1bac5df84d9b89e056ccb6b870a3de89"}
Apr 24 22:50:21.486067 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.485960 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" event={"ID":"3b896dfa-f45b-49b2-bc7a-c2446f500c54","Type":"ContainerStarted","Data":"4aa8c27006c6c7186616ad911e470a2fe1ca0c8f54f066bf0c40a2c4e31465d7"}
Apr 24 22:50:21.492976 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.492952 2572 scope.go:117] "RemoveContainer" containerID="7019c51baf36bfe369a36ac752098ea3aa04f1c118b3658b6bede56d13972bbd"
Apr 24 22:50:21.505499 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.505478 2572 scope.go:117] "RemoveContainer" containerID="cbd46ea13eefdd474d2f99d541bd6520795d05cb016195ec6a6b5218a3a1eca6"
Apr 24 22:50:21.506623 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.506585 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" podStartSLOduration=1.521775375 podStartE2EDuration="5.506572709s" podCreationTimestamp="2026-04-24 22:50:16 +0000 UTC" firstStartedPulling="2026-04-24 22:50:16.6253414 +0000 UTC m=+1223.773646925" lastFinishedPulling="2026-04-24 22:50:20.610138732 +0000 UTC m=+1227.758444259" observedRunningTime="2026-04-24 22:50:21.504372151 +0000 UTC m=+1228.652677699" watchObservedRunningTime="2026-04-24 22:50:21.506572709 +0000 UTC m=+1228.654878278"
Apr 24 22:50:21.518154 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.518134 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z"]
Apr 24 22:50:21.525223 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:21.525203 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-4d001-predictor-79b9cb5cf6-frt5z"]
Apr 24 22:50:23.417686 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:23.417650 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" path="/var/lib/kubelet/pods/5af0f174-66f8-4ba9-a0fb-b05fed19fccf/volumes"
Apr 24 22:50:38.540938 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:38.540905 2572 generic.go:358] "Generic (PLEG): container finished" podID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerID="4aa8c27006c6c7186616ad911e470a2fe1ca0c8f54f066bf0c40a2c4e31465d7" exitCode=0
Apr 24 22:50:38.541443 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:38.540982 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" event={"ID":"3b896dfa-f45b-49b2-bc7a-c2446f500c54","Type":"ContainerDied","Data":"4aa8c27006c6c7186616ad911e470a2fe1ca0c8f54f066bf0c40a2c4e31465d7"}
Apr 24 22:50:38.541443 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:38.541396 2572 scope.go:117] "RemoveContainer" containerID="4aa8c27006c6c7186616ad911e470a2fe1ca0c8f54f066bf0c40a2c4e31465d7"
Apr 24 22:50:39.068450 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.068422 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjfmx_must-gather-4zbvj_3b896dfa-f45b-49b2-bc7a-c2446f500c54/gather/0.log"
Apr 24 22:50:39.591828 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.591804 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9lwj/must-gather-ml74x"]
Apr 24 22:50:39.592206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.592161 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="storage-initializer"
Apr 24 22:50:39.592206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.592174 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="storage-initializer"
Apr 24 22:50:39.592206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.592184 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container"
Apr 24 22:50:39.592206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.592190 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container"
Apr 24 22:50:39.592206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.592200 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kube-rbac-proxy"
Apr 24 22:50:39.592206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.592206 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kube-rbac-proxy"
Apr 24 22:50:39.592398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.592260 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kube-rbac-proxy"
Apr 24 22:50:39.592398 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.592270 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5af0f174-66f8-4ba9-a0fb-b05fed19fccf" containerName="kserve-container"
Apr 24 22:50:39.595372 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.595356 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9lwj/must-gather-ml74x"
Apr 24 22:50:39.597798 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.597777 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9lwj\"/\"kube-root-ca.crt\""
Apr 24 22:50:39.597923 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.597900 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h9lwj\"/\"default-dockercfg-nh675\""
Apr 24 22:50:39.598904 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.598888 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9lwj\"/\"openshift-service-ca.crt\""
Apr 24 22:50:39.602390 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.602185 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9lwj/must-gather-ml74x"]
Apr 24 22:50:39.741177 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.741147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8gwn\" (UniqueName: \"kubernetes.io/projected/41701280-591c-47b6-91f8-c09e6fbbde8e-kube-api-access-h8gwn\") pod \"must-gather-ml74x\" (UID: \"41701280-591c-47b6-91f8-c09e6fbbde8e\") " pod="openshift-must-gather-h9lwj/must-gather-ml74x"
Apr 24 22:50:39.741297 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.741223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41701280-591c-47b6-91f8-c09e6fbbde8e-must-gather-output\") pod \"must-gather-ml74x\" (UID: \"41701280-591c-47b6-91f8-c09e6fbbde8e\") " pod="openshift-must-gather-h9lwj/must-gather-ml74x"
Apr 24 22:50:39.842195 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.842135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41701280-591c-47b6-91f8-c09e6fbbde8e-must-gather-output\") pod \"must-gather-ml74x\" (UID: \"41701280-591c-47b6-91f8-c09e6fbbde8e\") " pod="openshift-must-gather-h9lwj/must-gather-ml74x"
Apr 24 22:50:39.842195 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.842181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8gwn\" (UniqueName: \"kubernetes.io/projected/41701280-591c-47b6-91f8-c09e6fbbde8e-kube-api-access-h8gwn\") pod \"must-gather-ml74x\" (UID: \"41701280-591c-47b6-91f8-c09e6fbbde8e\") " pod="openshift-must-gather-h9lwj/must-gather-ml74x"
Apr 24 22:50:39.842451 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.842431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41701280-591c-47b6-91f8-c09e6fbbde8e-must-gather-output\") pod \"must-gather-ml74x\" (UID: \"41701280-591c-47b6-91f8-c09e6fbbde8e\") " pod="openshift-must-gather-h9lwj/must-gather-ml74x"
Apr 24 22:50:39.851760 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.851736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8gwn\" (UniqueName: \"kubernetes.io/projected/41701280-591c-47b6-91f8-c09e6fbbde8e-kube-api-access-h8gwn\") pod \"must-gather-ml74x\" (UID: \"41701280-591c-47b6-91f8-c09e6fbbde8e\") " pod="openshift-must-gather-h9lwj/must-gather-ml74x"
Apr 24 22:50:39.904819 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:39.904802 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9lwj/must-gather-ml74x"
Apr 24 22:50:40.022815 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:40.022671 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9lwj/must-gather-ml74x"]
Apr 24 22:50:40.025131 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:50:40.025104 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41701280_591c_47b6_91f8_c09e6fbbde8e.slice/crio-ca138c20fde66e75c2c85c6025062004904487b01dcc380a996208c6681bae55 WatchSource:0}: Error finding container ca138c20fde66e75c2c85c6025062004904487b01dcc380a996208c6681bae55: Status 404 returned error can't find the container with id ca138c20fde66e75c2c85c6025062004904487b01dcc380a996208c6681bae55
Apr 24 22:50:40.547842 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:40.547807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/must-gather-ml74x" event={"ID":"41701280-591c-47b6-91f8-c09e6fbbde8e","Type":"ContainerStarted","Data":"ca138c20fde66e75c2c85c6025062004904487b01dcc380a996208c6681bae55"}
Apr 24 22:50:41.554316 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:41.554273 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/must-gather-ml74x" event={"ID":"41701280-591c-47b6-91f8-c09e6fbbde8e","Type":"ContainerStarted","Data":"fc24066797da507d804055aaa5d9ce509c62e3dde1d32fb6eb83f802ec122857"}
Apr 24 22:50:41.554782 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:41.554324 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/must-gather-ml74x" event={"ID":"41701280-591c-47b6-91f8-c09e6fbbde8e","Type":"ContainerStarted","Data":"1530fc3e3b0564331fa63bb061240cd2277a94718b90115b027be2d3c17e9898"}
Apr 24 22:50:41.575386 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:41.575324 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9lwj/must-gather-ml74x" podStartSLOduration=1.6749943109999998 podStartE2EDuration="2.575305831s" podCreationTimestamp="2026-04-24 22:50:39 +0000 UTC" firstStartedPulling="2026-04-24 22:50:40.027201908 +0000 UTC m=+1247.175507433" lastFinishedPulling="2026-04-24 22:50:40.927513427 +0000 UTC m=+1248.075818953" observedRunningTime="2026-04-24 22:50:41.573646567 +0000 UTC m=+1248.721952116" watchObservedRunningTime="2026-04-24 22:50:41.575305831 +0000 UTC m=+1248.723611381"
Apr 24 22:50:42.344307 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:42.344244 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bnnl8_10c617f4-fae1-4f73-af0e-8e8500ece009/global-pull-secret-syncer/0.log"
Apr 24 22:50:42.562182 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:42.562154 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pqcxg_987422f7-15db-4df1-8e08-87491e9648dd/konnectivity-agent/0.log"
Apr 24 22:50:42.626670 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:42.626600 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-103.ec2.internal_02aaeaca2e7902be6bed83a0bfe87afe/haproxy/0.log"
Apr 24 22:50:44.445062 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.445007 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hjfmx/must-gather-4zbvj"]
Apr 24 22:50:44.445580 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.445310 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerName="copy" containerID="cri-o://47e9b72209ec56d834521b2299821aec1bac5df84d9b89e056ccb6b870a3de89" gracePeriod=2
Apr 24 22:50:44.448415 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.448387 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hjfmx/must-gather-4zbvj"]
Apr 24 22:50:44.448625 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.448573 2572 status_manager.go:895] "Failed to get status for pod" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" err="pods \"must-gather-4zbvj\" is forbidden: User \"system:node:ip-10-0-137-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjfmx\": no relationship found between node 'ip-10-0-137-103.ec2.internal' and this object"
Apr 24 22:50:44.589718 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.589690 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjfmx_must-gather-4zbvj_3b896dfa-f45b-49b2-bc7a-c2446f500c54/copy/0.log"
Apr 24 22:50:44.590622 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.590519 2572 generic.go:358] "Generic (PLEG): container finished" podID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerID="47e9b72209ec56d834521b2299821aec1bac5df84d9b89e056ccb6b870a3de89" exitCode=143
Apr 24 22:50:44.821048 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.820953 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjfmx_must-gather-4zbvj_3b896dfa-f45b-49b2-bc7a-c2446f500c54/copy/0.log"
Apr 24 22:50:44.821870 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.821633 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjfmx/must-gather-4zbvj"
Apr 24 22:50:44.823758 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.823705 2572 status_manager.go:895] "Failed to get status for pod" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" pod="openshift-must-gather-hjfmx/must-gather-4zbvj" err="pods \"must-gather-4zbvj\" is forbidden: User \"system:node:ip-10-0-137-103.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjfmx\": no relationship found between node 'ip-10-0-137-103.ec2.internal' and this object"
Apr 24 22:50:44.992032 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.991274 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b896dfa-f45b-49b2-bc7a-c2446f500c54-must-gather-output\") pod \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\" (UID: \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\") "
Apr 24 22:50:44.992032 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.991353 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq4j8\" (UniqueName: \"kubernetes.io/projected/3b896dfa-f45b-49b2-bc7a-c2446f500c54-kube-api-access-qq4j8\") pod \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\" (UID: \"3b896dfa-f45b-49b2-bc7a-c2446f500c54\") "
Apr 24 22:50:44.993643 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.993597 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b896dfa-f45b-49b2-bc7a-c2446f500c54-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3b896dfa-f45b-49b2-bc7a-c2446f500c54" (UID: "3b896dfa-f45b-49b2-bc7a-c2446f500c54"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:50:44.995307 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:44.994821 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b896dfa-f45b-49b2-bc7a-c2446f500c54-kube-api-access-qq4j8" (OuterVolumeSpecName: "kube-api-access-qq4j8") pod "3b896dfa-f45b-49b2-bc7a-c2446f500c54" (UID: "3b896dfa-f45b-49b2-bc7a-c2446f500c54"). InnerVolumeSpecName "kube-api-access-qq4j8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:50:45.092274 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.092181 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b896dfa-f45b-49b2-bc7a-c2446f500c54-must-gather-output\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:50:45.092274 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.092221 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qq4j8\" (UniqueName: \"kubernetes.io/projected/3b896dfa-f45b-49b2-bc7a-c2446f500c54-kube-api-access-qq4j8\") on node \"ip-10-0-137-103.ec2.internal\" DevicePath \"\""
Apr 24 22:50:45.423245 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.420503 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" path="/var/lib/kubelet/pods/3b896dfa-f45b-49b2-bc7a-c2446f500c54/volumes"
Apr 24 22:50:45.596836 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.596795 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjfmx_must-gather-4zbvj_3b896dfa-f45b-49b2-bc7a-c2446f500c54/copy/0.log"
Apr 24 22:50:45.597364 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.597304 2572 scope.go:117] "RemoveContainer" containerID="47e9b72209ec56d834521b2299821aec1bac5df84d9b89e056ccb6b870a3de89"
Apr 24 22:50:45.597512 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.597471 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjfmx/must-gather-4zbvj"
Apr 24 22:50:45.609186 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.608941 2572 scope.go:117] "RemoveContainer" containerID="4aa8c27006c6c7186616ad911e470a2fe1ca0c8f54f066bf0c40a2c4e31465d7"
Apr 24 22:50:45.952506 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.952464 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e44a6833-98fd-4227-8575-b155c7daa7df/alertmanager/0.log"
Apr 24 22:50:45.976197 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:45.976167 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e44a6833-98fd-4227-8575-b155c7daa7df/config-reloader/0.log"
Apr 24 22:50:46.018715 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.018671 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e44a6833-98fd-4227-8575-b155c7daa7df/kube-rbac-proxy-web/0.log"
Apr 24 22:50:46.079292 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.079265 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e44a6833-98fd-4227-8575-b155c7daa7df/kube-rbac-proxy/0.log"
Apr 24 22:50:46.128401 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.128319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e44a6833-98fd-4227-8575-b155c7daa7df/kube-rbac-proxy-metric/0.log"
Apr 24 22:50:46.169528 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.169500 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e44a6833-98fd-4227-8575-b155c7daa7df/prom-label-proxy/0.log"
Apr 24 22:50:46.194786 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.194749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e44a6833-98fd-4227-8575-b155c7daa7df/init-config-reloader/0.log"
Apr 24 22:50:46.314118 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.314088 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9fmp9_b907e1af-4d6b-43b5-9af8-d5f2e469c573/kube-state-metrics/0.log"
Apr 24 22:50:46.338479 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.338445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9fmp9_b907e1af-4d6b-43b5-9af8-d5f2e469c573/kube-rbac-proxy-main/0.log"
Apr 24 22:50:46.358796 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.358770 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9fmp9_b907e1af-4d6b-43b5-9af8-d5f2e469c573/kube-rbac-proxy-self/0.log"
Apr 24 22:50:46.385165 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.385098 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-594c959ccb-pdjc9_81ec86e8-c0e4-4de3-89b8-6f4a08425ad4/metrics-server/0.log"
Apr 24 22:50:46.513417 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.513385 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8bsz5_f3573d49-4d97-426a-86f2-6e6731507efa/node-exporter/0.log"
Apr 24 22:50:46.531310 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.531278 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8bsz5_f3573d49-4d97-426a-86f2-6e6731507efa/kube-rbac-proxy/0.log"
Apr 24 22:50:46.549667 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.549641 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8bsz5_f3573d49-4d97-426a-86f2-6e6731507efa/init-textfile/0.log"
Apr 24 22:50:46.642258 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.642171 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s4c96_e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48/kube-rbac-proxy-main/0.log"
Apr 24 22:50:46.665698 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.665670 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s4c96_e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48/kube-rbac-proxy-self/0.log"
Apr 24 22:50:46.683968 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.683940 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-s4c96_e2ecbf78-e58f-4d9d-88ab-d6b3278c3e48/openshift-state-metrics/0.log"
Apr 24 22:50:46.721720 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.721680 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4db0889d-c5f7-4627-b540-370829583e38/prometheus/0.log"
Apr 24 22:50:46.738281 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.738246 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4db0889d-c5f7-4627-b540-370829583e38/config-reloader/0.log"
Apr 24 22:50:46.757242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.757216 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4db0889d-c5f7-4627-b540-370829583e38/thanos-sidecar/0.log"
Apr 24 22:50:46.775460 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.775435 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4db0889d-c5f7-4627-b540-370829583e38/kube-rbac-proxy-web/0.log"
Apr 24 22:50:46.800178 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.800146 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4db0889d-c5f7-4627-b540-370829583e38/kube-rbac-proxy/0.log"
Apr 24 22:50:46.821582 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.821555 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4db0889d-c5f7-4627-b540-370829583e38/kube-rbac-proxy-thanos/0.log"
Apr 24 22:50:46.841805 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.841776 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4db0889d-c5f7-4627-b540-370829583e38/init-config-reloader/0.log"
Apr 24 22:50:46.873695 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.873661 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v84xt_19827ebc-0817-470a-95f6-3133e65a770d/prometheus-operator/0.log"
Apr 24 22:50:46.894030 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.893940 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v84xt_19827ebc-0817-470a-95f6-3133e65a770d/kube-rbac-proxy/0.log"
Apr 24 22:50:46.919341 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.919308 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9kw92_b1c4549d-5242-4209-9116-5088ce9fc89a/prometheus-operator-admission-webhook/0.log"
Apr 24 22:50:46.947058 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.947030 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-685fd5dc87-ngx4q_6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1/telemeter-client/0.log"
Apr 24 22:50:46.966734 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.966701 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-685fd5dc87-ngx4q_6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1/reload/0.log"
Apr 24 22:50:46.989555 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:46.989518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-685fd5dc87-ngx4q_6c6fbaa0-6698-42d2-90c3-3913ea9e8ea1/kube-rbac-proxy/0.log"
Apr 24 22:50:49.180567 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.180537 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64496c466b-btqrw_bda88ed8-ba87-422a-8cef-507c0c26da57/console/0.log"
Apr 24 22:50:49.351511 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.351470 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"]
Apr 24 22:50:49.352059 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.352033 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerName="copy"
Apr 24 22:50:49.352242 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.352227 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerName="copy"
Apr 24 22:50:49.352371 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.352358 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerName="gather"
Apr 24 22:50:49.352458 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.352450 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerName="gather"
Apr 24 22:50:49.352618 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.352608 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerName="gather"
Apr 24 22:50:49.352690 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.352681 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b896dfa-f45b-49b2-bc7a-c2446f500c54" containerName="copy"
Apr 24 22:50:49.357182 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.357162 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.360979 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.360954 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"]
Apr 24 22:50:49.538869 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.538840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-proc\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.538869 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.538874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcbn\" (UniqueName: \"kubernetes.io/projected/ccb92323-a9d0-4131-b1d6-1dcd97822147-kube-api-access-qmcbn\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.539110 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.538897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-sys\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.539110 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.538976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-lib-modules\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.539110 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.539026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-podres\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640270 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640244 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-proc\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640270 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcbn\" (UniqueName: \"kubernetes.io/projected/ccb92323-a9d0-4131-b1d6-1dcd97822147-kube-api-access-qmcbn\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640490 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-sys\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640490 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640324 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-lib-modules\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640490 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-podres\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640490 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-proc\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640490 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-sys\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640791 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640517 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-lib-modules\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.640791 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.640528 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ccb92323-a9d0-4131-b1d6-1dcd97822147-podres\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.649426 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.649395 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcbn\" (UniqueName: \"kubernetes.io/projected/ccb92323-a9d0-4131-b1d6-1dcd97822147-kube-api-access-qmcbn\") pod \"perf-node-gather-daemonset-bfgnh\" (UID: \"ccb92323-a9d0-4131-b1d6-1dcd97822147\") " pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.669254 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.669229 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:49.792421 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:49.792393 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"]
Apr 24 22:50:49.795189 ip-10-0-137-103 kubenswrapper[2572]: W0424 22:50:49.795161 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podccb92323_a9d0_4131_b1d6_1dcd97822147.slice/crio-630399de3c1c8a9c5f3f16f68b27204a28204bfc228fd6804cb86dfc7aff7bbe WatchSource:0}: Error finding container 630399de3c1c8a9c5f3f16f68b27204a28204bfc228fd6804cb86dfc7aff7bbe: Status 404 returned error can't find the container with id 630399de3c1c8a9c5f3f16f68b27204a28204bfc228fd6804cb86dfc7aff7bbe
Apr 24 22:50:50.311239 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:50.311212 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vv7c4_1bbc078c-a636-438c-a64a-5bdfc1d13816/dns/0.log"
Apr 24 22:50:50.329230 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:50.329203 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vv7c4_1bbc078c-a636-438c-a64a-5bdfc1d13816/kube-rbac-proxy/0.log"
Apr 24 22:50:50.349902 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:50.349878 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d44dl_e66ee62e-3c52-4dc3-b9b5-336aaebbd397/dns-node-resolver/0.log"
Apr 24 22:50:50.614998 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:50.614915 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh" event={"ID":"ccb92323-a9d0-4131-b1d6-1dcd97822147","Type":"ContainerStarted","Data":"393bb23b0236c2cc004b734685c6c120be851f73581a07d862219e6564f85f3f"}
Apr 24 22:50:50.614998 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:50.614959 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh" event={"ID":"ccb92323-a9d0-4131-b1d6-1dcd97822147","Type":"ContainerStarted","Data":"630399de3c1c8a9c5f3f16f68b27204a28204bfc228fd6804cb86dfc7aff7bbe"}
Apr 24 22:50:50.615253 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:50.615057 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh"
Apr 24 22:50:50.634442 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:50.634403 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh" podStartSLOduration=1.63438825 podStartE2EDuration="1.63438825s" podCreationTimestamp="2026-04-24 22:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:50:50.632776385 +0000 UTC m=+1257.781081935" watchObservedRunningTime="2026-04-24 22:50:50.63438825 +0000 UTC m=+1257.782693790"
Apr 24
22:50:50.853895 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:50.853858 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qgt9r_1e6d3a55-047e-4c45-a923-7fa5ce12912c/node-ca/0.log" Apr 24 22:50:51.907297 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:51.907271 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8swwq_5d1b586a-9b17-4bf0-9aae-512fce173232/serve-healthcheck-canary/0.log" Apr 24 22:50:52.316482 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:52.316454 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9c2z_8bfa88df-1892-46d6-b7cb-6e1a3ae2e536/kube-rbac-proxy/0.log" Apr 24 22:50:52.337984 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:52.337959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9c2z_8bfa88df-1892-46d6-b7cb-6e1a3ae2e536/exporter/0.log" Apr 24 22:50:52.357852 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:52.357830 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c9c2z_8bfa88df-1892-46d6-b7cb-6e1a3ae2e536/extractor/0.log" Apr 24 22:50:54.465285 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:54.465256 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-cz254_6cfa4b36-c34f-4afc-874b-e5c120a77e1c/server/0.log" Apr 24 22:50:54.575789 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:54.575765 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-px5c7_491b72ff-7459-4493-9992-dcf733d6c92e/s3-init/0.log" Apr 24 22:50:56.629049 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:56.628998 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h9lwj/perf-node-gather-daemonset-bfgnh" Apr 24 22:50:59.703100 ip-10-0-137-103 kubenswrapper[2572]: I0424 
22:50:59.703072 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l57fj_91b6537a-5c00-4286-854b-be48eb427fe2/kube-multus-additional-cni-plugins/0.log" Apr 24 22:50:59.722141 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:59.722116 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l57fj_91b6537a-5c00-4286-854b-be48eb427fe2/egress-router-binary-copy/0.log" Apr 24 22:50:59.740065 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:59.740044 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l57fj_91b6537a-5c00-4286-854b-be48eb427fe2/cni-plugins/0.log" Apr 24 22:50:59.757339 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:59.757321 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l57fj_91b6537a-5c00-4286-854b-be48eb427fe2/bond-cni-plugin/0.log" Apr 24 22:50:59.775206 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:59.775186 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l57fj_91b6537a-5c00-4286-854b-be48eb427fe2/routeoverride-cni/0.log" Apr 24 22:50:59.792394 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:59.792376 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l57fj_91b6537a-5c00-4286-854b-be48eb427fe2/whereabouts-cni-bincopy/0.log" Apr 24 22:50:59.809665 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:50:59.809645 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l57fj_91b6537a-5c00-4286-854b-be48eb427fe2/whereabouts-cni/0.log" Apr 24 22:51:00.273723 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:00.273656 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-t9drx_d399b0dc-b56e-4c25-8058-11c529fe99f7/kube-multus/0.log" Apr 24 22:51:00.388529 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:00.388454 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v4wwp_edbc33b8-02e4-43d1-a683-6dcd726340b7/network-metrics-daemon/0.log" Apr 24 22:51:00.407046 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:00.407023 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v4wwp_edbc33b8-02e4-43d1-a683-6dcd726340b7/kube-rbac-proxy/0.log" Apr 24 22:51:01.668460 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:01.668437 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9trm_6126be90-f084-4237-b144-cdf6cef066c9/ovn-controller/0.log" Apr 24 22:51:01.691136 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:01.691113 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9trm_6126be90-f084-4237-b144-cdf6cef066c9/ovn-acl-logging/0.log" Apr 24 22:51:01.706543 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:01.706515 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9trm_6126be90-f084-4237-b144-cdf6cef066c9/kube-rbac-proxy-node/0.log" Apr 24 22:51:01.726669 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:01.726652 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9trm_6126be90-f084-4237-b144-cdf6cef066c9/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 22:51:01.749560 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:01.749531 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9trm_6126be90-f084-4237-b144-cdf6cef066c9/northd/0.log" Apr 24 22:51:01.768280 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:01.768258 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9trm_6126be90-f084-4237-b144-cdf6cef066c9/nbdb/0.log" Apr 24 22:51:01.789189 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:01.789165 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9trm_6126be90-f084-4237-b144-cdf6cef066c9/sbdb/0.log" Apr 24 22:51:01.896802 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:01.896782 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x9trm_6126be90-f084-4237-b144-cdf6cef066c9/ovnkube-controller/0.log" Apr 24 22:51:02.836524 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:02.836490 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6q8zg_6a2536b6-b046-4594-8305-33498ed4dadd/network-check-target-container/0.log" Apr 24 22:51:03.763643 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:03.763617 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rsk2h_05911b94-3317-4b23-b34f-f95d4552f61e/iptables-alerter/0.log" Apr 24 22:51:04.370582 ip-10-0-137-103 kubenswrapper[2572]: I0424 22:51:04.370560 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4hbpm_221dffa5-7757-4e46-94b6-caf50b41f29e/tuned/0.log"