Apr 17 16:29:01.520741 ip-10-0-135-127 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:29:01.520756 ip-10-0-135-127 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:29:01.520765 ip-10-0-135-127 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:29:01.521077 ip-10-0-135-127 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:29:11.644586 ip-10-0-135-127 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:29:11.644605 ip-10-0-135-127 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7009c3b6e0db40da8e03d472f931ad1c --
Apr 17 16:31:42.354536 ip-10-0-135-127 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:42.794835 ip-10-0-135-127 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:42.794835 ip-10-0-135-127 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:42.794835 ip-10-0-135-127 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:42.794835 ip-10-0-135-127 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
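The deprecated-flag warnings above all point at the file named by `--config`. As a hedged sketch only (the field names `containerRuntimeEndpoint`, `volumePluginDir`, and `systemReserved` are real KubeletConfiguration v1beta1 fields, but the values here are illustrative, not taken from this node), the flagged options would move into the config file roughly like this:

```yaml
# KubeletConfiguration sketch -- illustrative values, not this node's actual config
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock  # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:  # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
```

On this node the config file is `/etc/kubernetes/kubelet.conf`, per the `--config` flag dumped later in the log.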
Apr 17 16:31:42.794835 ip-10-0-135-127 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:42.796460 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.796365 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:42.798647 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798631 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:42.798647 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798647 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798651 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798654 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798657 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798661 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798663 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798666 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798669 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798672 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798675 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798677 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798680 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798683 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798685 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798688 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798691 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798693 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798697 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798700 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798703 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:42.798707 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798706 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798708 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798712 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798715 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798717 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798721 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798724 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798726 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798729 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798732 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798735 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798738 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798740 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798743 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798745 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798748 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798750 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798753 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798756 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798758 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:42.799192 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798761 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798763 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798767 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798769 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798772 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798774 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798780 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798784 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798786 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798789 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798792 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798795 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798798 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798801 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798804 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798807 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798809 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798812 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798815 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:42.799702 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798819 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798823 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798826 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798837 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798841 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798844 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798847 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798850 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798853 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798855 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798858 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798860 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798863 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798865 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798868 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798871 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798874 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798877 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798880 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:42.800171 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798883 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:42.800640 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798885 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:42.800640 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798888 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:42.800640 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798891 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:42.800640 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798894 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:42.800640 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798896 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:42.800640 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.798899 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:42.801535 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801524 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:42.801535 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801534 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801539 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801542 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801545 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801548 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801551 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801553 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801556 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801559 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801562 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801564 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801568 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801571 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801573 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801576 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801578 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801581 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801584 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801586 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801589 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:42.801598 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801591 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801593 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801597 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801599 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801602 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801605 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801607 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801610 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801613 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801616 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801618 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801621 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801626 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801629 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801633 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801637 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801640 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801643 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801646 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:42.802070 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801648 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801650 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801653 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801655 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801658 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801661 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801663 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801666 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801668 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801671 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801674 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801676 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801679 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801681 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801684 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801687 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801690 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801693 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801695 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:42.802557 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801698 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801700 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801703 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801705 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801708 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801710 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801713 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801716 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801718 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801721 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801723 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801726 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801728 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801731 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801734 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801737 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801739 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801742 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801744 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801747 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:42.803045 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801750 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801752 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801755 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801757 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801760 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801762 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.801765 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801837 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801846 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801855 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801862 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801868 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801888 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801894 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801899 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801903 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801906 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801910 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801913 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801916 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801919 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801922 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801925 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801928 2569 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:42.803537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801931 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801934 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801939 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801942 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801946 2569 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801949 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801953 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801956 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801960 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801963 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801967 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801970 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801972 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801976 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801979 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801982 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801987 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801990 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801993 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801996 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.801999 2569 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802002 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802007 2569 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802010 2569 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802013 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:42.804126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802016 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802019 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802024 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802027 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802030 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802033 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802036 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802039 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802042 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802045 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802048 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802051 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802053 2569 flags.go:64] FLAG: --feature-gates=""
Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802057 2569 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802060 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802064 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802068 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802071 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802074 2569 flags.go:64] FLAG: --help="false" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802077 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-135-127.ec2.internal" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802081 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802084 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802087 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802091 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:31:42.804762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802095 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802098 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802101 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:42.805408 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:31:42.802104 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802107 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802110 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802113 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802116 2569 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802119 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802122 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802125 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802128 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802131 2569 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802134 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802136 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802139 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802145 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802148 2569 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802150 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802153 2569 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802156 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802159 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802162 2569 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802165 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802170 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:42.805408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802178 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802183 2569 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802186 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802189 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802193 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802195 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802199 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:42.806060 
ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802201 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802205 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802213 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802216 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802219 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802222 2569 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802225 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802231 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802234 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802237 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802240 2569 flags.go:64] FLAG: --port="10250" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802243 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802258 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02ff0a50eb1209c30" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802262 2569 flags.go:64] FLAG: --qos-reserved="" Apr 
17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802265 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802268 2569 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802271 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:42.806060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802274 2569 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802286 2569 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802289 2569 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802292 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802295 2569 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802303 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802306 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802310 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802313 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802316 2569 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802320 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802323 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802326 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802329 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802331 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802334 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802338 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802341 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802344 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802347 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802349 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802352 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802355 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802359 2569 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802362 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:42.806650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802367 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:42.807285 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:31:42.802370 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802373 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802377 2569 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802380 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802383 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802386 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802388 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802392 2569 flags.go:64] FLAG: --v="2" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802396 2569 flags.go:64] FLAG: --version="false" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802400 2569 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802405 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802409 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802509 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802512 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802515 2569 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstall Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802518 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802522 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802524 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802527 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802529 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802532 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802535 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:42.807285 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802538 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802540 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802543 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802546 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802549 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802551 2569 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802554 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802556 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802559 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802562 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802567 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802571 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802574 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802577 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802581 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802584 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802587 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802589 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802592 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:42.807865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802594 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802597 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802600 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802602 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802605 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802608 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802610 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802613 
2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802617 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802619 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802622 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802624 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802627 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802629 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802632 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802636 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802638 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802641 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802643 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:42.808371 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802646 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:42.808371 ip-10-0-135-127 
kubenswrapper[2569]: W0417 16:31:42.802649 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802651 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802654 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802656 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802659 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802661 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802664 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802666 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802669 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802671 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802674 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802676 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:42.808865 ip-10-0-135-127 
kubenswrapper[2569]: W0417 16:31:42.802679 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802681 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802684 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802687 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802689 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802692 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802695 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:42.808865 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802698 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802700 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802703 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802706 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802709 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802711 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix 
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802714 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802716 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802720 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802723 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802725 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802728 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802730 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802733 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802735 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802737 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802740 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:42.809353 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.802742 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:42.809767 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.802752 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:42.810544 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.810524 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 16:31:42.810582 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.810545 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 16:31:42.810610 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810594 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:42.810610 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810600 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:42.810610 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810604 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:42.810610 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810607 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:42.810610 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810611 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810614 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810617 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810621 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810624 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810627 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810630 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810633 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810636 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810639 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810642 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810645 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810647 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810650 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810653 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810656 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810658 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810661 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810663 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810666 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:42.810735 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810669 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810671 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810674 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810677 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810679 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810682 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810684 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810688 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810690 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810693 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810695 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810698 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810700 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810703 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810707 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810710 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810713 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810716 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810718 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810721 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:42.811243 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810723 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810726 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810729 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810731 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810734 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810736 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810739 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810742 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810745 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810748 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810751 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810753 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810756 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810759 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810761 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810764 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810766 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810769 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810771 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:42.811806 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810774 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810777 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810780 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810783 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810785 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810788 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810790 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810794 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810797 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810799 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810802 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810806 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810810 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810814 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810818 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810821 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810824 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810834 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810839 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:42.812331 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810842 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810845 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810848 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810850 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.810855 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810955 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810960 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810963 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810966 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810968 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810971 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810974 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810977 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810979 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810982 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:42.812789 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810985 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810988 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810990 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810993 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810996 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.810999 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811002 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811004 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811007 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811010 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811012 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811015 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811017 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811020 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811022 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811025 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811028 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811030 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811033 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811035 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:42.813162 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811038 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811040 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811044 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811047 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811050 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811053 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811056 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811058 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811061 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811063 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811066 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811068 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811071 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811073 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811076 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811078 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811081 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811083 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811086 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:42.813662 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811089 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811092 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811094 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811097 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811099 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811102 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811105 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811107 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811110 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811112 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811115 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811118 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811120 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811124 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811127 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811130 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811132 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811135 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811138 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811140 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:42.814129 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811143 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811145 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811147 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811150 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811152 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811155 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811157 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811160 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811163 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811166 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811169 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811171 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811173 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811176 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811178 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811181 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:42.814631 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:42.811183 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:42.815037 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.811188 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:42.815037 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.811968 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:31:42.815506 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.815492 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:31:42.816445 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.816434 2569 server.go:1019] "Starting client certificate rotation"
Apr 17 16:31:42.816540 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.816527 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:42.816574 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.816566 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:42.844420 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.844392 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:42.848197 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.848176 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:42.863752 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.863731 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:31:42.868913 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.868897 2569 log.go:25] "Validated CRI v1 image API"
Apr 17 16:31:42.870181 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.870151 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:31:42.871729 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.871714 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:42.873147 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.873124 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7acc03b8-7562-44fe-8cf1-2497ac96a78a:/dev/nvme0n1p4 976bbe59-b221-4627-98fe-fe2450bffe4a:/dev/nvme0n1p3]
Apr 17 16:31:42.873201 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.873148 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:31:42.879590 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.879279 2569 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:42.877719908 +0000 UTC m=+0.400745497 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099342 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23ff4947af9fc85b0b67a38c72625e SystemUUID:ec23ff49-47af-9fc8-5b0b-67a38c72625e BootID:7009c3b6-e0db-40da-8e03-d472f931ad1c Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:01:10:28:d6:d5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:01:10:28:d6:d5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:b7:58:ef:f3:cc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:31:42.879590 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.879556 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:31:42.879695 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.879647 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:31:42.881969 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.881945 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:31:42.882101 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.881970 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-127.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 16:31:42.882575 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.882564 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:31:42.882606 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.882578 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:31:42.882606 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.882591
2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:42.883242 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.883232 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:42.884066 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.884057 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:42.884168 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.884160 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:31:42.886977 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.886967 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:31:42.887009 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.886980 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 16:31:42.887009 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.886993 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:31:42.887009 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.887001 2569 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:31:42.887102 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.887010 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 16:31:42.888043 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.888032 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:42.888081 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.888049 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:42.891441 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.891424 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:31:42.892740 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:31:42.892727 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:31:42.894563 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894549 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:31:42.894609 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894573 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:31:42.894609 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894583 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:31:42.894609 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894592 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:31:42.894609 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894602 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:31:42.894710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894610 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:31:42.894710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894619 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:31:42.894710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894626 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:31:42.894710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894633 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:31:42.894710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894639 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:31:42.894710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894648 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
16:31:42.894710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.894657 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:31:42.896465 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.896452 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:31:42.896504 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.896467 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:31:42.898914 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.898759 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-127.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:31:42.898993 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.898733 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:31:42.899129 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.899113 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tt9lh" Apr 17 16:31:42.900020 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.900007 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:31:42.900073 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.900046 2569 server.go:1295] "Started kubelet" Apr 17 16:31:42.900159 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.900124 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:31:42.900245 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.900207 2569 ratelimit.go:55] 
"Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:31:42.900293 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.900275 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:31:42.900875 ip-10-0-135-127 systemd[1]: Started Kubernetes Kubelet. Apr 17 16:31:42.901447 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.901430 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:31:42.901485 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.901452 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:31:42.907185 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.907159 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-127.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:31:42.907185 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.907182 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tt9lh" Apr 17 16:31:42.907411 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.907397 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:31:42.907469 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.907402 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:42.908279 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.907207 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-127.ec2.internal.18a731f431e800be default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-127.ec2.internal,UID:ip-10-0-135-127.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-127.ec2.internal,},FirstTimestamp:2026-04-17 16:31:42.90001939 +0000 UTC m=+0.423044964,LastTimestamp:2026-04-17 16:31:42.90001939 +0000 UTC m=+0.423044964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-127.ec2.internal,}" Apr 17 16:31:42.908867 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.908847 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:31:42.908975 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.908964 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:31:42.909077 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.908995 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:31:42.909136 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.909017 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 16:31:42.909225 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.909207 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:31:42.909306 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.909234 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:31:42.910174 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.910081 2569 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:31:42.910288 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.910215 2569 factory.go:55] Registering systemd factory Apr 17 16:31:42.910288 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.910239 2569 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:31:42.912017 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.912000 2569 factory.go:153] Registering CRI-O factory Apr 17 16:31:42.912099 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.912023 2569 factory.go:223] Registration of the crio container factory successfully Apr 17 16:31:42.912391 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.912348 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:31:42.912513 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.912501 2569 factory.go:103] Registering Raw factory Apr 17 16:31:42.912596 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.912588 2569 manager.go:1196] Started watching for new ooms in manager Apr 17 16:31:42.912858 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.912833 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:42.913107 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.913077 2569 manager.go:319] Starting recovery of all containers Apr 17 16:31:42.916296 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.916269 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-127.ec2.internal\" not found" node="ip-10-0-135-127.ec2.internal" Apr 17 16:31:42.923828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.923814 2569 
manager.go:324] Recovery completed Apr 17 16:31:42.928056 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.928045 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:42.930324 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.930308 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:42.930388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.930337 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:42.930388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.930350 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:42.930827 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.930812 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:31:42.930902 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.930827 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:31:42.930902 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.930854 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:42.933645 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.933629 2569 policy_none.go:49] "None policy: Start" Apr 17 16:31:42.933728 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.933650 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:31:42.933728 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.933664 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:31:42.965804 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.965788 2569 manager.go:341] "Starting Device Plugin manager" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.965853 2569 manager.go:517] "Failed to read data from checkpoint" 
err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.965867 2569 server.go:85] "Starting device plugin registration server" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.966108 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.966121 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.966213 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.966326 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:42.966335 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.966992 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 16:31:42.982435 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:42.967030 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 16:31:43.035144 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.035110 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:31:43.036294 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.036268 2569 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 16:31:43.036294 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.036295 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:31:43.036435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.036315 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:31:43.036435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.036324 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:31:43.036435 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.036356 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:31:43.038957 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.038934 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:43.066900 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.066880 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:43.068022 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.068005 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:43.068120 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.068038 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:43.068120 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.068054 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:43.068120 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.068083 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.076617 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:31:43.076598 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.076705 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.076623 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-127.ec2.internal\": node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 16:31:43.093987 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.093962 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 16:31:43.137502 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.137415 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal"] Apr 17 16:31:43.137612 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.137508 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:43.138556 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.138538 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:43.138648 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.138571 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:43.138648 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.138586 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:43.140109 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140094 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:43.140280 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140265 2569 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.140330 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140294 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:43.140796 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140779 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:43.140871 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140811 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:43.140871 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140827 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:43.140871 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140780 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:43.141007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140881 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:43.141007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.140895 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:43.142098 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.142082 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.142150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.142115 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:43.142744 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.142729 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:43.142808 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.142754 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:43.142808 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.142767 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:43.162794 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.162772 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-127.ec2.internal\" not found" node="ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.166541 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.166523 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-127.ec2.internal\" not found" node="ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.194550 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.194534 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 16:31:43.210925 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.210906 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/356446819b043d77b4ba2d5504f23404-config\") pod 
\"kube-apiserver-proxy-ip-10-0-135-127.ec2.internal\" (UID: \"356446819b043d77b4ba2d5504f23404\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.210991 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.210933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.210991 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.210952 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.294982 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.294946 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 16:31:43.311354 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.311327 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/356446819b043d77b4ba2d5504f23404-config\") pod \"kube-apiserver-proxy-ip-10-0-135-127.ec2.internal\" (UID: \"356446819b043d77b4ba2d5504f23404\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 16:31:43.311461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.311362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:43.311461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.311390 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:43.311461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.311438 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:43.311585 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.311436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:43.311585 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.311435 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/356446819b043d77b4ba2d5504f23404-config\") pod \"kube-apiserver-proxy-ip-10-0-135-127.ec2.internal\" (UID: \"356446819b043d77b4ba2d5504f23404\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:43.395540 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.395459 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:43.464967 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.464929 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:43.469640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.469617 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:43.496438 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.496414 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:43.596907 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.596867 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:43.697467 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.697390 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:43.743102 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.743072 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:43.797701 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.797664 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:43.816131 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.816106 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:31:43.816335 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.816314 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:43.816401 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.816330 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:43.816401 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.816317 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:31:43.898719 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.898671 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:43.907851 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.907825 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:31:43.909581 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.909552 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:42 +0000 UTC" deadline="2027-11-04 14:45:07.332944608 +0000 UTC"
Apr 17 16:31:43.909581 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.909578 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13582h13m23.423369036s"
Apr 17 16:31:43.927387 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.927359 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:43.937815 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:43.937788 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod356446819b043d77b4ba2d5504f23404.slice/crio-5952e82d3f25f3c5e689848d9864c5c7674eb3520fb3696ffd6966671a7e3b59 WatchSource:0}: Error finding container 5952e82d3f25f3c5e689848d9864c5c7674eb3520fb3696ffd6966671a7e3b59: Status 404 returned error can't find the container with id 5952e82d3f25f3c5e689848d9864c5c7674eb3520fb3696ffd6966671a7e3b59
Apr 17 16:31:43.938201 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:43.938184 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d09bbd1af6f808e94311449d7cd444.slice/crio-27894caeeb973b2119440a6c49143af6f38e3b72cb034a9d9e3acabddf87d62b WatchSource:0}: Error finding container 27894caeeb973b2119440a6c49143af6f38e3b72cb034a9d9e3acabddf87d62b: Status 404 returned error can't find the container with id 27894caeeb973b2119440a6c49143af6f38e3b72cb034a9d9e3acabddf87d62b
Apr 17 16:31:43.942848 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.942824 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:31:43.948809 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.948767 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h6nwp"
Apr 17 16:31:43.959970 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:43.959952 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h6nwp"
Apr 17 16:31:43.999133 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:43.999107 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:44.039646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.039596 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" event={"ID":"356446819b043d77b4ba2d5504f23404","Type":"ContainerStarted","Data":"5952e82d3f25f3c5e689848d9864c5c7674eb3520fb3696ffd6966671a7e3b59"}
Apr 17 16:31:44.040589 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.040566 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" event={"ID":"d5d09bbd1af6f808e94311449d7cd444","Type":"ContainerStarted","Data":"27894caeeb973b2119440a6c49143af6f38e3b72cb034a9d9e3acabddf87d62b"}
Apr 17 16:31:44.099778 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:44.099744 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:44.200330 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:44.200245 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found"
Apr 17 16:31:44.203805 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.203785 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:44.209759 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.209737 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:44.220600 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.220583 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:44.221496 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.221484 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal"
Apr 17 16:31:44.231308 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.231291 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:31:44.888058 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.888025 2569 apiserver.go:52] "Watching apiserver"
Apr 17 16:31:44.897130 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.897105 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:31:44.897524 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.897500 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-7rxj5","openshift-multus/multus-5n4q5","openshift-multus/multus-additional-cni-plugins-k4c9b","openshift-multus/network-metrics-daemon-vtq9t","openshift-network-operator/iptables-alerter-c4d5f","kube-system/konnectivity-agent-jp52r","kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv","openshift-image-registry/node-ca-hch47","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal","openshift-network-diagnostics/network-check-target-hdwf7","openshift-ovn-kubernetes/ovnkube-node-79ft9"]
Apr 17 16:31:44.899797 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.899741 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.901083 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.901063 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.901961 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.901929 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7vn4z\""
Apr 17 16:31:44.901961 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.901936 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:44.902262 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.902231 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:44.903026 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.903008 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:31:44.903508 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.903399 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:31:44.903508 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.903455 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:31:44.903508 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.903465 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8p95g\""
Apr 17 16:31:44.903508 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.903496 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:31:44.903835 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.903819 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.905193 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.905140 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:44.905312 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:44.905218 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:31:44.908282 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.908047 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c4d5f"
Apr 17 16:31:44.908282 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.908067 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:31:44.908282 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.908159 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-krbhr\""
Apr 17 16:31:44.908459 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.908422 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:31:44.910206 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.910028 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jp52r"
Apr 17 16:31:44.910576 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.910554 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pfgg7\""
Apr 17 16:31:44.910668 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.910589 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:31:44.910736 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.910685 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:31:44.911002 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.910981 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:31:44.911590 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.911572 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:44.912143 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.912125 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 16:31:44.912443 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.912402 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xjsmz\""
Apr 17 16:31:44.912779 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.912703 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 16:31:44.913853 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.913835 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:31:44.914301 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.914002 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:31:44.914301 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.914064 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:31:44.914301 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.914109 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-85c7f\""
Apr 17 16:31:44.914301 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.914281 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hch47"
Apr 17 16:31:44.914547 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.914388 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:44.914547 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:44.914448 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:31:44.915598 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.915579 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:44.916635 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.916590 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:31:44.916813 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.916793 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:31:44.916887 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.916827 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:31:44.916926 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.916889 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2tf2s\""
Apr 17 16:31:44.918050 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.918034 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 16:31:44.919039 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919016 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 16:31:44.919442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919305 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 16:31:44.919442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919321 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 16:31:44.919442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919339 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 16:31:44.919442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919360 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-thqqx\""
Apr 17 16:31:44.919442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919310 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:31:44.919828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919808 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-system-cni-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.919887 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-kubelet\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.919887 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919869 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdks2\" (UniqueName: \"kubernetes.io/projected/15557662-26a5-4d16-b9d6-e301ff3e11c6-kube-api-access-wdks2\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.920021 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysctl-d\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.920021 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysctl-conf\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.920021 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919946 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-os-release\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920021 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.919995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-cni-multus\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920169 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920030 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a215469-2ba6-4a12-bd40-a197844067ed-multus-daemon-config\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920169 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920057 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-cni-binary-copy\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.920169 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:44.920169 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920109 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-var-lib-kubelet\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.920169 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920134 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-cni-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920171 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-k8s-cni-cncf-io\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920204 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-hostroot\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920232 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920272 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7051a978-dd0e-480e-93f4-b48b1dda0f32-konnectivity-ca\") pod \"konnectivity-agent-jp52r\" (UID: \"7051a978-dd0e-480e-93f4-b48b1dda0f32\") " pod="kube-system/konnectivity-agent-jp52r"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-kubernetes\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-systemd\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920372 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/deef8b97-d137-4d1d-b5bf-258429691ce3-tmp\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920397 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-socket-dir-parent\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920424 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-conf-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920450 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-etc-kubernetes\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920490 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-os-release\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920518 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920565 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-host\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920614 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-tuned\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920638 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-cnibin\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920662 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-multus-certs\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-cnibin\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwf6\" (UniqueName: \"kubernetes.io/projected/4666c56f-3d86-4e16-a782-6a41f0fe8825-kube-api-access-fwwf6\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:44.920784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920774 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtfwr\" (UniqueName: \"kubernetes.io/projected/963aea58-ae9e-49da-b049-4fd51933dfd1-kube-api-access-gtfwr\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920797 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-run\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920830 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-cni-bin\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920876 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/963aea58-ae9e-49da-b049-4fd51933dfd1-host-slash\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920905 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-modprobe-d\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-sys\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdznz\" (UniqueName: \"kubernetes.io/projected/deef8b97-d137-4d1d-b5bf-258429691ce3-kube-api-access-kdznz\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.920983 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-netns\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.921008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-system-cni-dir\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.921031 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysconfig\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.921053 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-lib-modules\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.921076 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a215469-2ba6-4a12-bd40-a197844067ed-cni-binary-copy\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.921098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfzlh\" (UniqueName: \"kubernetes.io/projected/6a215469-2ba6-4a12-bd40-a197844067ed-kube-api-access-jfzlh\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.921136 2569
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/963aea58-ae9e-49da-b049-4fd51933dfd1-iptables-alerter-script\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f" Apr 17 16:31:44.921209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.921158 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7051a978-dd0e-480e-93f4-b48b1dda0f32-agent-certs\") pod \"konnectivity-agent-jp52r\" (UID: \"7051a978-dd0e-480e-93f4-b48b1dda0f32\") " pod="kube-system/konnectivity-agent-jp52r" Apr 17 16:31:44.960605 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.960576 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:43 +0000 UTC" deadline="2027-11-08 22:04:06.069702701 +0000 UTC" Apr 17 16:31:44.960605 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:44.960605 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13685h32m21.109102085s" Apr 17 16:31:45.010383 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.010359 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:31:45.022155 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022127 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-multus-certs\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.022299 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022161 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gtfwr\" (UniqueName: \"kubernetes.io/projected/963aea58-ae9e-49da-b049-4fd51933dfd1-kube-api-access-gtfwr\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f" Apr 17 16:31:45.022299 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-run\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.022410 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022390 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-multus-certs\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.022461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022448 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-node-log\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.022509 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-run\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.022509 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022486 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-ovnkube-script-lib\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.022578 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022525 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-cni-bin\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.022578 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022545 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/963aea58-ae9e-49da-b049-4fd51933dfd1-host-slash\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f" Apr 17 16:31:45.022578 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022562 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqfw5\" (UniqueName: \"kubernetes.io/projected/e4b20a08-0520-48d1-bce6-26fcc1371d10-kube-api-access-cqfw5\") pod \"node-ca-hch47\" (UID: \"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47" Apr 17 16:31:45.022578 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022564 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-cni-bin\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:31:45.022584 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-slash\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022609 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp2cr\" (UniqueName: \"kubernetes.io/projected/97b20cee-5673-4b39-a3f9-105d0d794713-kube-api-access-pp2cr\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/963aea58-ae9e-49da-b049-4fd51933dfd1-host-slash\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022637 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysconfig\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-etc-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022707 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysconfig\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.022732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-env-overrides\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-system-cni-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022775 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-kubelet\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysctl-d\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022849 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-kubelet\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022842 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-system-cni-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: 
\"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-var-lib-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022910 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97b20cee-5673-4b39-a3f9-105d0d794713-ovn-node-metrics-cert\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-var-lib-kubelet\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022955 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysctl-d\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022963 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4b20a08-0520-48d1-bce6-26fcc1371d10-serviceca\") pod \"node-ca-hch47\" (UID: \"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-ovn\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.022998 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-var-lib-kubelet\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-k8s-cni-cncf-io\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-kubernetes\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:31:45.023059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdznz\" (UniqueName: \"kubernetes.io/projected/deef8b97-d137-4d1d-b5bf-258429691ce3-kube-api-access-kdznz\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.023166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023074 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-cni-netd\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023083 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-k8s-cni-cncf-io\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-conf-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023120 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-kubernetes\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.024007 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:31:45.023120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-os-release\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023180 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-os-release\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023215 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4b20a08-0520-48d1-bce6-26fcc1371d10-host\") pod \"node-ca-hch47\" (UID: 
\"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023240 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-run-netns\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-conf-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023279 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-cnibin\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-cnibin\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023331 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwf6\" (UniqueName: \"kubernetes.io/projected/4666c56f-3d86-4e16-a782-6a41f0fe8825-kube-api-access-fwwf6\") pod \"network-metrics-daemon-vtq9t\" 
(UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-device-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-modprobe-d\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023406 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-sys\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.024007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023424 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b" Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-cni-bin\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023467 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-ovnkube-config\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-netns\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023499 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-cnibin\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023520 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-system-cni-dir\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b" Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023557 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-system-cni-dir\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-lib-modules\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023595 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-sys\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-socket-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023636 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-run-netns\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a215469-2ba6-4a12-bd40-a197844067ed-cni-binary-copy\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-lib-modules\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023767 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfzlh\" (UniqueName: \"kubernetes.io/projected/6a215469-2ba6-4a12-bd40-a197844067ed-kube-api-access-jfzlh\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023795 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/963aea58-ae9e-49da-b049-4fd51933dfd1-iptables-alerter-script\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.024821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023831 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/15557662-26a5-4d16-b9d6-e301ff3e11c6-cnibin\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023859 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7051a978-dd0e-480e-93f4-b48b1dda0f32-agent-certs\") pod \"konnectivity-agent-jp52r\" (UID: \"7051a978-dd0e-480e-93f4-b48b1dda0f32\") " pod="kube-system/konnectivity-agent-jp52r"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023953 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-modprobe-d\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.023965 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bc7t\" (UniqueName: \"kubernetes.io/projected/27d18268-52ee-45c5-b488-31fa18c8328d-kube-api-access-2bc7t\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdks2\" (UniqueName: \"kubernetes.io/projected/15557662-26a5-4d16-b9d6-e301ff3e11c6-kube-api-access-wdks2\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysctl-conf\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024120 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-etc-selinux\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-kubelet\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-systemd\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-run-ovn-kubernetes\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024227 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-sysctl-conf\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-os-release\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024239 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-cni-multus\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-os-release\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a215469-2ba6-4a12-bd40-a197844067ed-multus-daemon-config\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024324 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/963aea58-ae9e-49da-b049-4fd51933dfd1-iptables-alerter-script\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f"
Apr 17 16:31:45.025577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-cni-binary-copy\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-host-var-lib-cni-multus\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/deef8b97-d137-4d1d-b5bf-258429691ce3-tmp\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024382 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a215469-2ba6-4a12-bd40-a197844067ed-cni-binary-copy\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-cni-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-hostroot\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024478 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7051a978-dd0e-480e-93f4-b48b1dda0f32-konnectivity-ca\") pod \"konnectivity-agent-jp52r\" (UID: \"7051a978-dd0e-480e-93f4-b48b1dda0f32\") " pod="kube-system/konnectivity-agent-jp52r"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-cni-dir\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.024494 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024503 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-systemd\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024530 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-sys-fs\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.024555 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs podName:4666c56f-3d86-4e16-a782-6a41f0fe8825 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.524533945 +0000 UTC m=+3.047559522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs") pod "network-metrics-daemon-vtq9t" (UID: "4666c56f-3d86-4e16-a782-6a41f0fe8825") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024589 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-systemd-units\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024625 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-socket-dir-parent\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-etc-kubernetes\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024697 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-host\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-tuned\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024750 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-registration-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-log-socket\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024823 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-cni-binary-copy\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024865 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a215469-2ba6-4a12-bd40-a197844067ed-multus-daemon-config\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-multus-socket-dir-parent\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-hostroot\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024940 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a215469-2ba6-4a12-bd40-a197844067ed-etc-kubernetes\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/15557662-26a5-4d16-b9d6-e301ff3e11c6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.024990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-systemd\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.025039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/deef8b97-d137-4d1d-b5bf-258429691ce3-host\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.026892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.025068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7051a978-dd0e-480e-93f4-b48b1dda0f32-konnectivity-ca\") pod \"konnectivity-agent-jp52r\" (UID: \"7051a978-dd0e-480e-93f4-b48b1dda0f32\") " pod="kube-system/konnectivity-agent-jp52r"
Apr 17 16:31:45.027733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.027514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/deef8b97-d137-4d1d-b5bf-258429691ce3-tmp\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.027838 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.027582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/deef8b97-d137-4d1d-b5bf-258429691ce3-etc-tuned\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.027838 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.027804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7051a978-dd0e-480e-93f4-b48b1dda0f32-agent-certs\") pod \"konnectivity-agent-jp52r\" (UID: \"7051a978-dd0e-480e-93f4-b48b1dda0f32\") " pod="kube-system/konnectivity-agent-jp52r"
Apr 17 16:31:45.031086 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.031061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtfwr\" (UniqueName: \"kubernetes.io/projected/963aea58-ae9e-49da-b049-4fd51933dfd1-kube-api-access-gtfwr\") pod \"iptables-alerter-c4d5f\" (UID: \"963aea58-ae9e-49da-b049-4fd51933dfd1\") " pod="openshift-network-operator/iptables-alerter-c4d5f"
Apr 17 16:31:45.031890 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.031843 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdznz\" (UniqueName: \"kubernetes.io/projected/deef8b97-d137-4d1d-b5bf-258429691ce3-kube-api-access-kdznz\") pod \"tuned-7rxj5\" (UID: \"deef8b97-d137-4d1d-b5bf-258429691ce3\") " pod="openshift-cluster-node-tuning-operator/tuned-7rxj5"
Apr 17 16:31:45.032472 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.032445 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwf6\" (UniqueName: \"kubernetes.io/projected/4666c56f-3d86-4e16-a782-6a41f0fe8825-kube-api-access-fwwf6\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:45.033445 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.033424 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfzlh\" (UniqueName: \"kubernetes.io/projected/6a215469-2ba6-4a12-bd40-a197844067ed-kube-api-access-jfzlh\") pod \"multus-5n4q5\" (UID: \"6a215469-2ba6-4a12-bd40-a197844067ed\") " pod="openshift-multus/multus-5n4q5"
Apr 17 16:31:45.033546 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.033530 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdks2\" (UniqueName: \"kubernetes.io/projected/15557662-26a5-4d16-b9d6-e301ff3e11c6-kube-api-access-wdks2\") pod \"multus-additional-cni-plugins-k4c9b\" (UID: \"15557662-26a5-4d16-b9d6-e301ff3e11c6\") " pod="openshift-multus/multus-additional-cni-plugins-k4c9b"
Apr 17 16:31:45.126041 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-registration-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.126224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126053 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-log-socket\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126107 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-node-log\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126134 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-ovnkube-script-lib\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126156 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqfw5\" (UniqueName: \"kubernetes.io/projected/e4b20a08-0520-48d1-bce6-26fcc1371d10-kube-api-access-cqfw5\") pod \"node-ca-hch47\" (UID: \"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47"
Apr 17 16:31:45.126224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-registration-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.126224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-slash\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp2cr\" (UniqueName: \"kubernetes.io/projected/97b20cee-5673-4b39-a3f9-105d0d794713-kube-api-access-pp2cr\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-node-log\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126225 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-log-socket\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-slash\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-etc-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126314 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-etc-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-env-overrides\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126462 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-var-lib-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97b20cee-5673-4b39-a3f9-105d0d794713-ovn-node-metrics-cert\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126567 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4b20a08-0520-48d1-bce6-26fcc1371d10-serviceca\") pod \"node-ca-hch47\" (UID: \"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126579 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-var-lib-openvswitch\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-ovn\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.126633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-cni-netd\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126653 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4b20a08-0520-48d1-bce6-26fcc1371d10-host\") pod \"node-ca-hch47\" (UID: \"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126671 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-ovn\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126704 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-run-netns\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126716 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-cni-netd\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-device-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4b20a08-0520-48d1-bce6-26fcc1371d10-host\") pod \"node-ca-hch47\" (UID: \"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-cni-bin\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-ovnkube-config\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-env-overrides\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9"
Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName:
\"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-device-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126795 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-run-netns\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-socket-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-ovnkube-script-lib\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126895 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2bc7t\" (UniqueName: \"kubernetes.io/projected/27d18268-52ee-45c5-b488-31fa18c8328d-kube-api-access-2bc7t\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126945 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127359 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126947 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-etc-selinux\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126896 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-cni-bin\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.126990 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-kubelet\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:31:45.127002 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-etc-selinux\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-socket-dir\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127019 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-systemd\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-run-ovn-kubernetes\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4b20a08-0520-48d1-bce6-26fcc1371d10-serviceca\") pod \"node-ca-hch47\" (UID: \"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47" Apr 17 
16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127050 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-kubelet\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127050 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-run-systemd\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-host-run-ovn-kubernetes\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-sys-fs\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-systemd-units\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127141 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/27d18268-52ee-45c5-b488-31fa18c8328d-sys-fs\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127192 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97b20cee-5673-4b39-a3f9-105d0d794713-systemd-units\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.127841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.127272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97b20cee-5673-4b39-a3f9-105d0d794713-ovnkube-config\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.129342 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.129323 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97b20cee-5673-4b39-a3f9-105d0d794713-ovn-node-metrics-cert\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.137955 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.137931 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:45.137955 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.137957 2569 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:45.138207 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.137971 2569 projected.go:194] Error preparing data for projected volume kube-api-access-sn6cw for pod openshift-network-diagnostics/network-check-target-hdwf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:45.138207 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.138048 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw podName:290ef757-149c-497a-85e3-cc6a8cd8fc45 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.638030165 +0000 UTC m=+3.161055747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-sn6cw" (UniqueName: "kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw") pod "network-check-target-hdwf7" (UID: "290ef757-149c-497a-85e3-cc6a8cd8fc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:45.139942 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.139910 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp2cr\" (UniqueName: \"kubernetes.io/projected/97b20cee-5673-4b39-a3f9-105d0d794713-kube-api-access-pp2cr\") pod \"ovnkube-node-79ft9\" (UID: \"97b20cee-5673-4b39-a3f9-105d0d794713\") " pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.140025 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.139912 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bc7t\" (UniqueName: 
\"kubernetes.io/projected/27d18268-52ee-45c5-b488-31fa18c8328d-kube-api-access-2bc7t\") pod \"aws-ebs-csi-driver-node-r5nzv\" (UID: \"27d18268-52ee-45c5-b488-31fa18c8328d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.143088 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.143071 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqfw5\" (UniqueName: \"kubernetes.io/projected/e4b20a08-0520-48d1-bce6-26fcc1371d10-kube-api-access-cqfw5\") pod \"node-ca-hch47\" (UID: \"e4b20a08-0520-48d1-bce6-26fcc1371d10\") " pod="openshift-image-registry/node-ca-hch47" Apr 17 16:31:45.211898 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.211862 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" Apr 17 16:31:45.219500 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.219475 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5n4q5" Apr 17 16:31:45.228192 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.228171 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" Apr 17 16:31:45.228742 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.228599 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:45.231767 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.231747 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c4d5f" Apr 17 16:31:45.238336 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.238319 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jp52r" Apr 17 16:31:45.245351 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.245334 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" Apr 17 16:31:45.253031 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.253011 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hch47" Apr 17 16:31:45.258649 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.258628 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:31:45.356764 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.356729 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:45.528854 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:45.528828 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b20cee_5673_4b39_a3f9_105d0d794713.slice/crio-3a7ff64fc7aae3d6ad0052e3f60f100ab86ef736650950aa1775097f46f246c4 WatchSource:0}: Error finding container 3a7ff64fc7aae3d6ad0052e3f60f100ab86ef736650950aa1775097f46f246c4: Status 404 returned error can't find the container with id 3a7ff64fc7aae3d6ad0052e3f60f100ab86ef736650950aa1775097f46f246c4 Apr 17 16:31:45.530124 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.530086 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2wqxs"] Apr 17 16:31:45.530487 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:45.530458 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a215469_2ba6_4a12_bd40_a197844067ed.slice/crio-15395e4d17b2cde9e3ed819ca930f680e8eac0c05e2a189635e3513f63b7025b WatchSource:0}: Error finding 
container 15395e4d17b2cde9e3ed819ca930f680e8eac0c05e2a189635e3513f63b7025b: Status 404 returned error can't find the container with id 15395e4d17b2cde9e3ed819ca930f680e8eac0c05e2a189635e3513f63b7025b Apr 17 16:31:45.531065 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.531039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:31:45.531309 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.531239 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:45.531371 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.531326 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs podName:4666c56f-3d86-4e16-a782-6a41f0fe8825 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:46.531305467 +0000 UTC m=+4.054331034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs") pod "network-metrics-daemon-vtq9t" (UID: "4666c56f-3d86-4e16-a782-6a41f0fe8825") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:45.534672 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.533695 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2wqxs" Apr 17 16:31:45.537079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.536848 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 16:31:45.537079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.536896 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 16:31:45.537567 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.537408 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-25cmm\"" Apr 17 16:31:45.538519 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:45.538493 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d18268_52ee_45c5_b488_31fa18c8328d.slice/crio-55200da3de1c836b7826808ea5f32a59390f5dbda74cc1591a14e3f511d5bb8a WatchSource:0}: Error finding container 55200da3de1c836b7826808ea5f32a59390f5dbda74cc1591a14e3f511d5bb8a: Status 404 returned error can't find the container with id 55200da3de1c836b7826808ea5f32a59390f5dbda74cc1591a14e3f511d5bb8a Apr 17 16:31:45.540903 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:45.540796 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b20a08_0520_48d1_bce6_26fcc1371d10.slice/crio-ba3722c01f0316c5c8a961534b82ef8c61d0fdf7a46a761105fa6999b65f2262 WatchSource:0}: Error finding container ba3722c01f0316c5c8a961534b82ef8c61d0fdf7a46a761105fa6999b65f2262: Status 404 returned error can't find the container with id ba3722c01f0316c5c8a961534b82ef8c61d0fdf7a46a761105fa6999b65f2262 Apr 17 16:31:45.543126 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:45.543101 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7051a978_dd0e_480e_93f4_b48b1dda0f32.slice/crio-6cac9000eaa872b1c52c130714011c7f3caa50f088c34e846c4260202eaed23f WatchSource:0}: Error finding container 6cac9000eaa872b1c52c130714011c7f3caa50f088c34e846c4260202eaed23f: Status 404 returned error can't find the container with id 6cac9000eaa872b1c52c130714011c7f3caa50f088c34e846c4260202eaed23f Apr 17 16:31:45.543400 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:45.543365 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeef8b97_d137_4d1d_b5bf_258429691ce3.slice/crio-7b500be821f449ff5dbc3c6688c0a1e46b585eb77b9da82b6c2a9cc65174fecf WatchSource:0}: Error finding container 7b500be821f449ff5dbc3c6688c0a1e46b585eb77b9da82b6c2a9cc65174fecf: Status 404 returned error can't find the container with id 7b500be821f449ff5dbc3c6688c0a1e46b585eb77b9da82b6c2a9cc65174fecf Apr 17 16:31:45.631411 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.631383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-hosts-file\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs" Apr 17 16:31:45.631530 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.631417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-tmp-dir\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs" Apr 17 16:31:45.631530 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.631519 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdp5\" (UniqueName: 
\"kubernetes.io/projected/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-kube-api-access-tmdp5\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs" Apr 17 16:31:45.732714 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.732678 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:31:45.732910 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.732722 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-hosts-file\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs" Apr 17 16:31:45.732910 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.732747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-tmp-dir\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs" Apr 17 16:31:45.732910 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.732783 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdp5\" (UniqueName: \"kubernetes.io/projected/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-kube-api-access-tmdp5\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs" Apr 17 16:31:45.732910 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.732817 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:45.732910 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.732838 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:45.732910 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.732854 2569 projected.go:194] Error preparing data for projected volume kube-api-access-sn6cw for pod openshift-network-diagnostics/network-check-target-hdwf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:45.732910 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.732852 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-hosts-file\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs" Apr 17 16:31:45.732910 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:45.732905 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw podName:290ef757-149c-497a-85e3-cc6a8cd8fc45 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:46.732888145 +0000 UTC m=+4.255913711 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sn6cw" (UniqueName: "kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw") pod "network-check-target-hdwf7" (UID: "290ef757-149c-497a-85e3-cc6a8cd8fc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:45.733274 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.733104 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-tmp-dir\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs"
Apr 17 16:31:45.742843 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.742810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdp5\" (UniqueName: \"kubernetes.io/projected/2be0d6b4-a7ac-45cf-80dc-e5427b8f1559-kube-api-access-tmdp5\") pod \"node-resolver-2wqxs\" (UID: \"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559\") " pod="openshift-dns/node-resolver-2wqxs"
Apr 17 16:31:45.846380 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.845925 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2wqxs"
Apr 17 16:31:45.854272 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:31:45.854226 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2be0d6b4_a7ac_45cf_80dc_e5427b8f1559.slice/crio-3043ec0dfeb0bffde498189e1f0a1ecbf8ef4a8b58ab9b98b3484a5f26a63d7e WatchSource:0}: Error finding container 3043ec0dfeb0bffde498189e1f0a1ecbf8ef4a8b58ab9b98b3484a5f26a63d7e: Status 404 returned error can't find the container with id 3043ec0dfeb0bffde498189e1f0a1ecbf8ef4a8b58ab9b98b3484a5f26a63d7e
Apr 17 16:31:45.961134 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.961091 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:43 +0000 UTC" deadline="2027-12-13 17:38:32.466237691 +0000 UTC"
Apr 17 16:31:45.961134 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:45.961129 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14521h6m46.505112658s"
Apr 17 16:31:46.050870 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.050444 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wqxs" event={"ID":"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559","Type":"ContainerStarted","Data":"3043ec0dfeb0bffde498189e1f0a1ecbf8ef4a8b58ab9b98b3484a5f26a63d7e"}
Apr 17 16:31:46.053165 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.052758 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jp52r" event={"ID":"7051a978-dd0e-480e-93f4-b48b1dda0f32","Type":"ContainerStarted","Data":"6cac9000eaa872b1c52c130714011c7f3caa50f088c34e846c4260202eaed23f"}
Apr 17 16:31:46.054638 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.054587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hch47" event={"ID":"e4b20a08-0520-48d1-bce6-26fcc1371d10","Type":"ContainerStarted","Data":"ba3722c01f0316c5c8a961534b82ef8c61d0fdf7a46a761105fa6999b65f2262"}
Apr 17 16:31:46.057310 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.057284 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" event={"ID":"356446819b043d77b4ba2d5504f23404","Type":"ContainerStarted","Data":"fb11587b3fdb87450237c67247e5211201b4a52bdf6cb86a581d094ba02f0b68"}
Apr 17 16:31:46.060206 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.060149 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerStarted","Data":"d7d87b097d700013becb82b7f5663de78a7e270458d696311d52aa782ae87765"}
Apr 17 16:31:46.066532 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.066503 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" event={"ID":"deef8b97-d137-4d1d-b5bf-258429691ce3","Type":"ContainerStarted","Data":"7b500be821f449ff5dbc3c6688c0a1e46b585eb77b9da82b6c2a9cc65174fecf"}
Apr 17 16:31:46.071697 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.071669 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" event={"ID":"27d18268-52ee-45c5-b488-31fa18c8328d","Type":"ContainerStarted","Data":"55200da3de1c836b7826808ea5f32a59390f5dbda74cc1591a14e3f511d5bb8a"}
Apr 17 16:31:46.074540 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.074515 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c4d5f" event={"ID":"963aea58-ae9e-49da-b049-4fd51933dfd1","Type":"ContainerStarted","Data":"40c438849bbdb82a17fb4598eafb7081a8c47a38e4c248e8b31c86b8b42145b5"}
Apr 17 16:31:46.077988 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.077964 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5n4q5" event={"ID":"6a215469-2ba6-4a12-bd40-a197844067ed","Type":"ContainerStarted","Data":"15395e4d17b2cde9e3ed819ca930f680e8eac0c05e2a189635e3513f63b7025b"}
Apr 17 16:31:46.085976 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.085923 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"3a7ff64fc7aae3d6ad0052e3f60f100ab86ef736650950aa1775097f46f246c4"}
Apr 17 16:31:46.539790 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.539706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:46.539945 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:46.539900 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:46.540007 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:46.539964 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs podName:4666c56f-3d86-4e16-a782-6a41f0fe8825 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.539944327 +0000 UTC m=+6.062969893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs") pod "network-metrics-daemon-vtq9t" (UID: "4666c56f-3d86-4e16-a782-6a41f0fe8825") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:46.740722 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.740681 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:46.740903 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:46.740864 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:46.740903 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:46.740885 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:46.740903 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:46.740898 2569 projected.go:194] Error preparing data for projected volume kube-api-access-sn6cw for pod openshift-network-diagnostics/network-check-target-hdwf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:46.741044 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:46.740968 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw podName:290ef757-149c-497a-85e3-cc6a8cd8fc45 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:48.740950067 +0000 UTC m=+6.263975651 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn6cw" (UniqueName: "kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw") pod "network-check-target-hdwf7" (UID: "290ef757-149c-497a-85e3-cc6a8cd8fc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:46.877842 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:46.877764 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:47.036736 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:47.036691 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:47.037211 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:47.036865 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:31:47.037404 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:47.037385 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:47.037514 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:47.037493 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:31:47.111677 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:47.111638 2569 generic.go:358] "Generic (PLEG): container finished" podID="d5d09bbd1af6f808e94311449d7cd444" containerID="4d6c3470ce62bd96fbe70824b1d7b5b7bcbd6e19423e54f79524fc088adbeb91" exitCode=0
Apr 17 16:31:47.113561 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:47.111745 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" event={"ID":"d5d09bbd1af6f808e94311449d7cd444","Type":"ContainerDied","Data":"4d6c3470ce62bd96fbe70824b1d7b5b7bcbd6e19423e54f79524fc088adbeb91"}
Apr 17 16:31:47.133110 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:47.132796 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" podStartSLOduration=3.132778781 podStartE2EDuration="3.132778781s" podCreationTimestamp="2026-04-17 16:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:46.076700205 +0000 UTC m=+3.599725820" watchObservedRunningTime="2026-04-17 16:31:47.132778781 +0000 UTC m=+4.655804365"
Apr 17 16:31:48.120082 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:48.119524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" event={"ID":"d5d09bbd1af6f808e94311449d7cd444","Type":"ContainerStarted","Data":"91ed4d72892bb97548d9a835b645c3e8745bff5234647051fff667d9d9bee122"}
Apr 17 16:31:48.557106 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:48.557008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:48.557317 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:48.557157 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:48.557317 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:48.557224 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs podName:4666c56f-3d86-4e16-a782-6a41f0fe8825 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:52.557205159 +0000 UTC m=+10.080230740 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs") pod "network-metrics-daemon-vtq9t" (UID: "4666c56f-3d86-4e16-a782-6a41f0fe8825") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:48.759106 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:48.759067 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:48.759337 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:48.759225 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:48.759337 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:48.759264 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:48.759337 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:48.759278 2569 projected.go:194] Error preparing data for projected volume kube-api-access-sn6cw for pod openshift-network-diagnostics/network-check-target-hdwf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:48.759513 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:48.759344 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw podName:290ef757-149c-497a-85e3-cc6a8cd8fc45 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:52.759325799 +0000 UTC m=+10.282351362 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn6cw" (UniqueName: "kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw") pod "network-check-target-hdwf7" (UID: "290ef757-149c-497a-85e3-cc6a8cd8fc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:49.038159 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:49.037423 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:49.038159 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:49.037494 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:49.038159 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:49.037603 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:31:49.038159 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:49.037689 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:31:51.037723 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:51.037516 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:51.037723 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:51.037649 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:31:51.038234 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:51.038056 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:51.038234 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:51.038167 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:31:52.590037 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:52.589997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:52.590548 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:52.590194 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:52.590548 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:52.590270 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs podName:4666c56f-3d86-4e16-a782-6a41f0fe8825 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:00.59023777 +0000 UTC m=+18.113263346 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs") pod "network-metrics-daemon-vtq9t" (UID: "4666c56f-3d86-4e16-a782-6a41f0fe8825") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:52.791449 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:52.791336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:52.791625 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:52.791535 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:52.791625 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:52.791563 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:52.791625 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:52.791577 2569 projected.go:194] Error preparing data for projected volume kube-api-access-sn6cw for pod openshift-network-diagnostics/network-check-target-hdwf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:52.791781 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:52.791639 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw podName:290ef757-149c-497a-85e3-cc6a8cd8fc45 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:00.791618787 +0000 UTC m=+18.314644366 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn6cw" (UniqueName: "kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw") pod "network-check-target-hdwf7" (UID: "290ef757-149c-497a-85e3-cc6a8cd8fc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:53.041768 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:53.041687 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:53.041929 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:53.041821 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:31:53.042230 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:53.042211 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:53.042333 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:53.042315 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:31:55.037374 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:55.037328 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:55.037856 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:55.037348 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:55.037856 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:55.037462 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:31:55.037856 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:55.037522 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:31:57.039298 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:57.039269 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:57.039695 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:57.039277 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:57.039695 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:57.039371 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:31:57.039695 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:57.039486 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:31:59.036750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:59.036714 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:31:59.037184 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:31:59.036714 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:31:59.037184 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:59.036859 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:31:59.037184 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:31:59.036926 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:00.644294 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:00.644237 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:00.644780 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:00.644383 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:32:00.644780 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:00.644464 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs podName:4666c56f-3d86-4e16-a782-6a41f0fe8825 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:16.644445125 +0000 UTC m=+34.167470690 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs") pod "network-metrics-daemon-vtq9t" (UID: "4666c56f-3d86-4e16-a782-6a41f0fe8825") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:32:00.845953 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:00.845913 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:00.846121 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:00.846057 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:32:00.846121 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:00.846073 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:32:00.846121 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:00.846083 2569 projected.go:194] Error preparing data for projected volume kube-api-access-sn6cw for pod openshift-network-diagnostics/network-check-target-hdwf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:32:00.846272 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:00.846142 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw podName:290ef757-149c-497a-85e3-cc6a8cd8fc45 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:16.846126267 +0000 UTC m=+34.369151857 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn6cw" (UniqueName: "kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw") pod "network-check-target-hdwf7" (UID: "290ef757-149c-497a-85e3-cc6a8cd8fc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:32:01.037508 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:01.037420 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:01.037657 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:01.037551 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:01.037657 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:01.037603 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:01.037772 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:01.037725 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:03.037475 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.037299 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:03.038141 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.037295 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:03.038141 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:03.037554 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:03.038141 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:03.037601 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:03.146849 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.146587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerStarted","Data":"caa38b496e7cff1e6e50f990f3e0c257b1263b4f61ec459e5f23119506079fc3"}
Apr 17 16:32:03.148022 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.147996 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" event={"ID":"deef8b97-d137-4d1d-b5bf-258429691ce3","Type":"ContainerStarted","Data":"0e105dbea479a348bb07bc4e8551833cb5a9a4fab5c2b1179c0243fd9c86967d"}
Apr 17 16:32:03.149418 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.149393 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" event={"ID":"27d18268-52ee-45c5-b488-31fa18c8328d","Type":"ContainerStarted","Data":"cd1db6df2d0f2443413c9663086d184a0aa9b00eba2bc7830d84ca4dcb5af135"}
Apr 17 16:32:03.150751 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.150706 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5n4q5" event={"ID":"6a215469-2ba6-4a12-bd40-a197844067ed","Type":"ContainerStarted","Data":"8055aaa18d6577dc66552d51f4b7ee9ed3363d69f317939417c3827c6e467c65"}
Apr 17 16:32:03.152017 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.151992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wqxs" event={"ID":"2be0d6b4-a7ac-45cf-80dc-e5427b8f1559","Type":"ContainerStarted","Data":"06b51f9782efad33966a40c149f31e5d170f878ff8fd65b4edb2ce2908f1ce7f"}
Apr 17 16:32:03.153115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.153086 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jp52r" event={"ID":"7051a978-dd0e-480e-93f4-b48b1dda0f32","Type":"ContainerStarted","Data":"df510de2b49660d24e2d6749946771a45224d70bd1462ce9c9800d460780df72"}
Apr 17 16:32:03.154260 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.154226 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hch47" event={"ID":"e4b20a08-0520-48d1-bce6-26fcc1371d10","Type":"ContainerStarted","Data":"5e7385dcc61664b162346ba524039292348f7b9de19290f8947132cf3084b8c3"}
Apr 17 16:32:03.221971 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.221920 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" podStartSLOduration=19.221906864 podStartE2EDuration="19.221906864s" podCreationTimestamp="2026-04-17 16:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:48.135603806 +0000 UTC m=+5.658629392" watchObservedRunningTime="2026-04-17 16:32:03.221906864 +0000 UTC m=+20.744932448"
Apr 17 16:32:03.241913 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.241863 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hch47" podStartSLOduration=3.220372937 podStartE2EDuration="20.24184796s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.542577048 +0000 UTC m=+3.065602615" lastFinishedPulling="2026-04-17 16:32:02.564052062 +0000 UTC m=+20.087077638" observedRunningTime="2026-04-17 16:32:03.241101939 +0000 UTC m=+20.764127523" watchObservedRunningTime="2026-04-17 16:32:03.24184796 +0000 UTC m=+20.764873545"
Apr 17 16:32:03.250148 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.250124 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2wqxs_2be0d6b4-a7ac-45cf-80dc-e5427b8f1559/dns-node-resolver/0.log"
Apr 17 16:32:03.262852 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.262644 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7rxj5" podStartSLOduration=3.207629986 podStartE2EDuration="20.262627281s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.54552612 +0000 UTC m=+3.068551696" lastFinishedPulling="2026-04-17 16:32:02.600523415 +0000 UTC m=+20.123548991" observedRunningTime="2026-04-17 16:32:03.261997182 +0000 UTC m=+20.785022767" watchObservedRunningTime="2026-04-17 16:32:03.262627281 +0000 UTC m=+20.785652866"
Apr 17 16:32:03.284221 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.284168 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5n4q5" podStartSLOduration=3.202808815 podStartE2EDuration="20.284150981s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.536485246 +0000 UTC m=+3.059510814" lastFinishedPulling="2026-04-17 16:32:02.617827418 +0000 UTC m=+20.140852980" observedRunningTime="2026-04-17 16:32:03.28369985 +0000 UTC m=+20.806725436" watchObservedRunningTime="2026-04-17 16:32:03.284150981 +0000 UTC m=+20.807176585"
Apr 17 16:32:03.299153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.299064 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jp52r" podStartSLOduration=7.982612069 podStartE2EDuration="20.299045594s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.544321261 +0000 UTC m=+3.067346830" lastFinishedPulling="2026-04-17 16:31:57.860754786 +0000 UTC m=+15.383780355" observedRunningTime="2026-04-17 16:32:03.299020991 +0000 UTC m=+20.822046576" watchObservedRunningTime="2026-04-17 16:32:03.299045594 +0000 UTC m=+20.822071173"
Apr 17 16:32:03.314386 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:03.314300 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2wqxs" podStartSLOduration=1.56938338 podStartE2EDuration="18.314280292s" podCreationTimestamp="2026-04-17 16:31:45 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.855861309 +0000 UTC m=+3.378886874" lastFinishedPulling="2026-04-17 16:32:02.600758207 +0000 UTC m=+20.123783786" observedRunningTime="2026-04-17 16:32:03.31404661 +0000 UTC m=+20.837072198" watchObservedRunningTime="2026-04-17 16:32:03.314280292 +0000 UTC m=+20.837305874" Apr 17 16:32:04.158268 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.158217 2569 generic.go:358] "Generic (PLEG): container finished" podID="15557662-26a5-4d16-b9d6-e301ff3e11c6" containerID="caa38b496e7cff1e6e50f990f3e0c257b1263b4f61ec459e5f23119506079fc3" exitCode=0 Apr 17 16:32:04.158667 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.158335 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerDied","Data":"caa38b496e7cff1e6e50f990f3e0c257b1263b4f61ec459e5f23119506079fc3"} Apr 17 16:32:04.160410 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.160117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c4d5f" event={"ID":"963aea58-ae9e-49da-b049-4fd51933dfd1","Type":"ContainerStarted","Data":"9d99ddd4b6881e83793d093aa3fbdd5e08164c03d11e50c8a7a0dc7d53de315b"} Apr 17 16:32:04.166394 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.166366 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"cfcc64a64b9bcd1554139f9c4924a679c77ab8e280eb9620a7145403edc12f7d"} Apr 17 16:32:04.166495 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.166403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"317202a183fceed7e2a28b31bc6f4a23ead299dab8ae285a44057d081066682a"} Apr 17 16:32:04.166495 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.166419 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"ed0e8bd9321d38cd935aff1fd80b4e1e0ca52b1fd1516e4731413de30f3b5f70"} Apr 17 16:32:04.166495 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.166430 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"6190e5c11c85e0d649636e8f9fc3cf945ee960db0648ac64a1ee8e76130a07e7"} Apr 17 16:32:04.166495 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.166443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"cdc1eb2d418cf2b7811ff47113d75295e875c1fb5fd160755b7f5587f4235a43"} Apr 17 16:32:04.194563 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.194520 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-c4d5f" podStartSLOduration=4.123842659 podStartE2EDuration="21.194506528s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.535131908 +0000 UTC m=+3.058157474" lastFinishedPulling="2026-04-17 16:32:02.605795775 +0000 UTC m=+20.128821343" observedRunningTime="2026-04-17 16:32:04.194312211 +0000 UTC m=+21.717337792" watchObservedRunningTime="2026-04-17 16:32:04.194506528 +0000 UTC m=+21.717532132" Apr 17 16:32:04.234641 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.234612 2569 plugin_watcher.go:194] "Adding socket path or 
updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:32:04.424511 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.424293 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hch47_e4b20a08-0520-48d1-bce6-26fcc1371d10/node-ca/0.log" Apr 17 16:32:04.978436 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.978329 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:32:04.234634707Z","UUID":"612c64ee-7436-4e5b-8d11-1f24c54f7f04","Handler":null,"Name":"","Endpoint":""} Apr 17 16:32:04.981674 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.981644 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:32:04.981810 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:04.981683 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:32:05.037576 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:05.037522 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:05.037765 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:05.037669 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825" Apr 17 16:32:05.037836 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:05.037759 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:05.037888 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:05.037864 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45" Apr 17 16:32:05.169849 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:05.169798 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" event={"ID":"27d18268-52ee-45c5-b488-31fa18c8328d","Type":"ContainerStarted","Data":"c671ce1be09edecbc3fe5f2a7dbbba28cbd1fd37622803589e314e05edf7ddf4"} Apr 17 16:32:05.172687 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:05.172649 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"9a821309d3213712b90141a6b0a7d93791d73366ecb47ef48e5c7f2e05e6fe6d"} Apr 17 16:32:06.176693 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:06.176658 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" event={"ID":"27d18268-52ee-45c5-b488-31fa18c8328d","Type":"ContainerStarted","Data":"05ad2fdef7a5401592133921a105355421cf4495fb02b1bffd5a55aa57078162"} Apr 17 16:32:06.194441 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:06.194394 2569 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r5nzv" podStartSLOduration=3.505064242 podStartE2EDuration="23.194374453s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.540464274 +0000 UTC m=+3.063489846" lastFinishedPulling="2026-04-17 16:32:05.229774481 +0000 UTC m=+22.752800057" observedRunningTime="2026-04-17 16:32:06.194343397 +0000 UTC m=+23.717368981" watchObservedRunningTime="2026-04-17 16:32:06.194374453 +0000 UTC m=+23.717400042" Apr 17 16:32:07.036905 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:07.036863 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:07.037089 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:07.036873 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:07.037089 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:07.037022 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825" Apr 17 16:32:07.037089 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:07.037060 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45" Apr 17 16:32:07.137054 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:07.137011 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jp52r" Apr 17 16:32:07.137731 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:07.137708 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jp52r" Apr 17 16:32:07.182512 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:07.182454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"bb67459bc9e9d2da522d1764a277d2bd839525a6a5d9503cab5d607fec89502c"} Apr 17 16:32:07.183214 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:07.182709 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jp52r" Apr 17 16:32:07.183569 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:07.183546 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jp52r" Apr 17 16:32:09.037139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.036949 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:09.037686 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.036951 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:09.037686 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:09.037228 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45" Apr 17 16:32:09.037686 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:09.037283 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825" Apr 17 16:32:09.188475 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.188440 2569 generic.go:358] "Generic (PLEG): container finished" podID="15557662-26a5-4d16-b9d6-e301ff3e11c6" containerID="346108714f9a403be1fc4c6497e88c090d9f32e068c31a5d76a676804751f5be" exitCode=0 Apr 17 16:32:09.188636 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.188525 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerDied","Data":"346108714f9a403be1fc4c6497e88c090d9f32e068c31a5d76a676804751f5be"} Apr 17 16:32:09.191790 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.191767 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" 
event={"ID":"97b20cee-5673-4b39-a3f9-105d0d794713","Type":"ContainerStarted","Data":"86b0f0bd68a7fbf5fbe2289155234ba9fda207524799efefe646661cb2ce2ce3"} Apr 17 16:32:09.192094 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.192079 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:32:09.192221 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.192101 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:32:09.192221 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.192110 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:32:09.206645 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.206628 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:32:09.206731 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.206685 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:32:09.234301 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:09.234240 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" podStartSLOduration=8.67413845 podStartE2EDuration="26.234228253s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.534042923 +0000 UTC m=+3.057068489" lastFinishedPulling="2026-04-17 16:32:03.094132717 +0000 UTC m=+20.617158292" observedRunningTime="2026-04-17 16:32:09.2338826 +0000 UTC m=+26.756908183" watchObservedRunningTime="2026-04-17 16:32:09.234228253 +0000 UTC m=+26.757253837" Apr 17 16:32:10.196398 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:10.196370 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="15557662-26a5-4d16-b9d6-e301ff3e11c6" containerID="6b415890d736a57cd4d9497ea34b9a01e2bdde7f0e9dd3f2956fceeab9b69147" exitCode=0 Apr 17 16:32:10.196759 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:10.196456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerDied","Data":"6b415890d736a57cd4d9497ea34b9a01e2bdde7f0e9dd3f2956fceeab9b69147"} Apr 17 16:32:10.409730 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:10.409476 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vtq9t"] Apr 17 16:32:10.410448 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:10.410065 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hdwf7"] Apr 17 16:32:10.410448 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:10.410142 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:10.410448 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:10.410169 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:10.410448 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:10.410298 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825" Apr 17 16:32:10.410448 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:10.410405 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45" Apr 17 16:32:11.200310 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:11.200208 2569 generic.go:358] "Generic (PLEG): container finished" podID="15557662-26a5-4d16-b9d6-e301ff3e11c6" containerID="b6309af6df68c39e716590f3a49049af3ee925edd0d13c28f03b2f302fae74d8" exitCode=0 Apr 17 16:32:11.200310 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:11.200280 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerDied","Data":"b6309af6df68c39e716590f3a49049af3ee925edd0d13c28f03b2f302fae74d8"} Apr 17 16:32:12.037295 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:12.037244 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:12.037482 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:12.037244 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:12.037482 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:12.037385 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45" Apr 17 16:32:12.037575 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:12.037479 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825" Apr 17 16:32:14.036934 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:14.036904 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:14.037666 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:14.036904 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:14.037666 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:14.037046 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825" Apr 17 16:32:14.037666 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:14.037093 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45" Apr 17 16:32:16.036496 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:16.036465 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:16.037065 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:16.036465 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:16.037065 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:16.036592 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825" Apr 17 16:32:16.037065 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:16.036668 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45" Apr 17 16:32:16.658457 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:16.658420 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:16.658639 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:16.658562 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:32:16.658639 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:16.658636 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs podName:4666c56f-3d86-4e16-a782-6a41f0fe8825 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:48.658616503 +0000 UTC m=+66.181642069 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs") pod "network-metrics-daemon-vtq9t" (UID: "4666c56f-3d86-4e16-a782-6a41f0fe8825") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:32:16.859712 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:16.859680 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:16.859874 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:16.859828 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:32:16.859874 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:16.859843 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:32:16.859874 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:16.859852 2569 projected.go:194] Error preparing data for projected volume kube-api-access-sn6cw for pod openshift-network-diagnostics/network-check-target-hdwf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:32:16.859973 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:16.859897 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw podName:290ef757-149c-497a-85e3-cc6a8cd8fc45 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:48.859884909 +0000 UTC m=+66.382910476 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn6cw" (UniqueName: "kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw") pod "network-check-target-hdwf7" (UID: "290ef757-149c-497a-85e3-cc6a8cd8fc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:32:18.037471 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:18.037435 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:18.037939 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:18.037435 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:18.037939 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:18.037545 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:18.037939 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:18.037616 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:18.215763 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:18.215727 2569 generic.go:358] "Generic (PLEG): container finished" podID="15557662-26a5-4d16-b9d6-e301ff3e11c6" containerID="1e409704a4f4529b6eae84066194604bf892b421573e2af0b9415c4e328db16e" exitCode=0
Apr 17 16:32:18.215923 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:18.215786 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerDied","Data":"1e409704a4f4529b6eae84066194604bf892b421573e2af0b9415c4e328db16e"}
Apr 17 16:32:19.219761 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:19.219729 2569 generic.go:358] "Generic (PLEG): container finished" podID="15557662-26a5-4d16-b9d6-e301ff3e11c6" containerID="0092e360f12d81185b50bbfbdc4c741fd502633b52a1e4d539464233c59ca438" exitCode=0
Apr 17 16:32:19.220223 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:19.219785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerDied","Data":"0092e360f12d81185b50bbfbdc4c741fd502633b52a1e4d539464233c59ca438"}
Apr 17 16:32:20.036796 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:20.036594 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:20.036957 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:20.036602 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:20.036957 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:20.036871 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:20.037112 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:20.036977 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:20.224190 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:20.224095 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" event={"ID":"15557662-26a5-4d16-b9d6-e301ff3e11c6","Type":"ContainerStarted","Data":"8421f8c1c18f37b94777124a11a3289096494aa3b0562543251dcff5c5861d08"}
Apr 17 16:32:20.251031 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:20.250979 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k4c9b" podStartSLOduration=5.557702678 podStartE2EDuration="37.250966705s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.547010146 +0000 UTC m=+3.070035710" lastFinishedPulling="2026-04-17 16:32:17.240274174 +0000 UTC m=+34.763299737" observedRunningTime="2026-04-17 16:32:20.249447701 +0000 UTC m=+37.772473285" watchObservedRunningTime="2026-04-17 16:32:20.250966705 +0000 UTC m=+37.773992288"
Apr 17 16:32:22.036840 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:22.036810 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:22.037218 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:22.036922 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:22.037218 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:22.036992 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:22.037218 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:22.037068 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:24.037040 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:24.037006 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:24.037442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:24.037014 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:24.037442 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:24.037115 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:24.037442 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:24.037199 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:26.036817 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:26.036777 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:26.037296 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:26.036785 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:26.037296 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:26.036990 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:26.037296 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:26.036875 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:28.037416 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:28.037384 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:28.037416 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:28.037426 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:28.037972 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:28.037554 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:28.037972 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:28.037653 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:30.037006 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:30.036968 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:30.037426 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:30.036982 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:30.037426 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:30.037071 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:30.037426 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:30.037150 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:32.036918 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:32.036884 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:32.037342 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:32.036897 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:32.037342 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:32.036983 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:32.037342 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:32.037064 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:34.036821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:34.036786 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:34.036821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:34.036818 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:34.037291 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:34.036885 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:34.037291 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:34.036993 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:36.037425 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:36.037392 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:36.037844 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:36.037392 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:36.037844 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:36.037491 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hdwf7" podUID="290ef757-149c-497a-85e3-cc6a8cd8fc45"
Apr 17 16:32:36.037844 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:36.037607 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vtq9t" podUID="4666c56f-3d86-4e16-a782-6a41f0fe8825"
Apr 17 16:32:37.750970 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.750893 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeReady"
Apr 17 16:32:37.751344 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.751016 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 16:32:37.814064 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.814032 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bhgll"]
Apr 17 16:32:37.874148 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.874116 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wbbbm"]
Apr 17 16:32:37.874328 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.874287 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:37.876897 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.876872 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xz9tf\""
Apr 17 16:32:37.877060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.876873 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 16:32:37.877060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.876974 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 16:32:37.895184 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.895163 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bhgll"]
Apr 17 16:32:37.895184 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.895187 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wbbbm"]
Apr 17 16:32:37.895326 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.895305 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:37.898001 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.897980 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 16:32:37.898143 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.898128 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 16:32:37.898224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.898168 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vwmhs\""
Apr 17 16:32:37.898434 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.898419 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 16:32:37.898947 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.898931 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 16:32:37.914533 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.914515 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4jlzg"]
Apr 17 16:32:37.934413 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.934392 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4jlzg"]
Apr 17 16:32:37.934514 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.934491 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4jlzg"
Apr 17 16:32:37.937125 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.937104 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 16:32:37.937236 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.937126 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 16:32:37.937236 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.937182 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 16:32:37.937236 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:37.937194 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cnl7t\""
Apr 17 16:32:38.011628 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02278729-f9e5-4615-9be2-2f650d08b858-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.011628 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011579 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995d1a32-09a5-4100-bd91-c6ccdf96086d-metrics-tls\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.011628 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nps5\" (UniqueName: \"kubernetes.io/projected/02278729-f9e5-4615-9be2-2f650d08b858-kube-api-access-5nps5\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.011908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011659 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs6s8\" (UniqueName: \"kubernetes.io/projected/995d1a32-09a5-4100-bd91-c6ccdf96086d-kube-api-access-bs6s8\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.011908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02278729-f9e5-4615-9be2-2f650d08b858-crio-socket\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.011908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011748 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02278729-f9e5-4615-9be2-2f650d08b858-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.011908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/02278729-f9e5-4615-9be2-2f650d08b858-data-volume\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.011908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/995d1a32-09a5-4100-bd91-c6ccdf96086d-tmp-dir\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.011908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.011839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995d1a32-09a5-4100-bd91-c6ccdf96086d-config-volume\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.036807 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.036781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7"
Apr 17 16:32:38.036896 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.036781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t"
Apr 17 16:32:38.039664 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.039638 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:32:38.039664 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.039638 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:32:38.039862 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.039638 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:32:38.039862 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.039638 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fwf7s\""
Apr 17 16:32:38.039862 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.039661 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6m66z\""
Apr 17 16:32:38.112504 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112467 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995d1a32-09a5-4100-bd91-c6ccdf96086d-metrics-tls\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.112504 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112504 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nps5\" (UniqueName: \"kubernetes.io/projected/02278729-f9e5-4615-9be2-2f650d08b858-kube-api-access-5nps5\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.112708 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs6s8\" (UniqueName: \"kubernetes.io/projected/995d1a32-09a5-4100-bd91-c6ccdf96086d-kube-api-access-bs6s8\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.112708 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02278729-f9e5-4615-9be2-2f650d08b858-crio-socket\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.112708 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02278729-f9e5-4615-9be2-2f650d08b858-crio-socket\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.112708 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02278729-f9e5-4615-9be2-2f650d08b858-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.112901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112733 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/02278729-f9e5-4615-9be2-2f650d08b858-data-volume\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.112901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/995d1a32-09a5-4100-bd91-c6ccdf96086d-tmp-dir\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.112901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995d1a32-09a5-4100-bd91-c6ccdf96086d-config-volume\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.113051 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28bd3d62-5065-4e51-a02d-686fb319fffc-cert\") pod \"ingress-canary-4jlzg\" (UID: \"28bd3d62-5065-4e51-a02d-686fb319fffc\") " pod="openshift-ingress-canary/ingress-canary-4jlzg"
Apr 17 16:32:38.113051 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.112948 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cpbt\" (UniqueName: \"kubernetes.io/projected/28bd3d62-5065-4e51-a02d-686fb319fffc-kube-api-access-7cpbt\") pod \"ingress-canary-4jlzg\" (UID: \"28bd3d62-5065-4e51-a02d-686fb319fffc\") " pod="openshift-ingress-canary/ingress-canary-4jlzg"
Apr 17 16:32:38.113153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.113049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02278729-f9e5-4615-9be2-2f650d08b858-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.113153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.113068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/995d1a32-09a5-4100-bd91-c6ccdf96086d-tmp-dir\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.113153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.113132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/02278729-f9e5-4615-9be2-2f650d08b858-data-volume\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.113511 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.113494 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995d1a32-09a5-4100-bd91-c6ccdf96086d-config-volume\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.113575 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.113511 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02278729-f9e5-4615-9be2-2f650d08b858-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.116669 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.116650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995d1a32-09a5-4100-bd91-c6ccdf96086d-metrics-tls\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.116669 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.116659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02278729-f9e5-4615-9be2-2f650d08b858-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.120374 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.120350 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs6s8\" (UniqueName: \"kubernetes.io/projected/995d1a32-09a5-4100-bd91-c6ccdf96086d-kube-api-access-bs6s8\") pod \"dns-default-bhgll\" (UID: \"995d1a32-09a5-4100-bd91-c6ccdf96086d\") " pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.120487 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.120469 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nps5\" (UniqueName: \"kubernetes.io/projected/02278729-f9e5-4615-9be2-2f650d08b858-kube-api-access-5nps5\") pod \"insights-runtime-extractor-wbbbm\" (UID: \"02278729-f9e5-4615-9be2-2f650d08b858\") " pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.183473 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.183441 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bhgll"
Apr 17 16:32:38.203198 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.203169 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wbbbm"
Apr 17 16:32:38.214226 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.214181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28bd3d62-5065-4e51-a02d-686fb319fffc-cert\") pod \"ingress-canary-4jlzg\" (UID: \"28bd3d62-5065-4e51-a02d-686fb319fffc\") " pod="openshift-ingress-canary/ingress-canary-4jlzg"
Apr 17 16:32:38.214338 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.214270 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cpbt\" (UniqueName: \"kubernetes.io/projected/28bd3d62-5065-4e51-a02d-686fb319fffc-kube-api-access-7cpbt\") pod \"ingress-canary-4jlzg\" (UID: \"28bd3d62-5065-4e51-a02d-686fb319fffc\") " pod="openshift-ingress-canary/ingress-canary-4jlzg"
Apr 17 16:32:38.216534 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.216514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28bd3d62-5065-4e51-a02d-686fb319fffc-cert\") pod \"ingress-canary-4jlzg\" (UID: \"28bd3d62-5065-4e51-a02d-686fb319fffc\") " pod="openshift-ingress-canary/ingress-canary-4jlzg"
Apr 17 16:32:38.222271 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.222224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cpbt\" (UniqueName: \"kubernetes.io/projected/28bd3d62-5065-4e51-a02d-686fb319fffc-kube-api-access-7cpbt\") pod \"ingress-canary-4jlzg\" (UID: \"28bd3d62-5065-4e51-a02d-686fb319fffc\") " pod="openshift-ingress-canary/ingress-canary-4jlzg"
Apr 17 16:32:38.242174 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.242139 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4jlzg"
Apr 17 16:32:38.362951 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.362921 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bhgll"]
Apr 17 16:32:38.366177 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:38.366153 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod995d1a32_09a5_4100_bd91_c6ccdf96086d.slice/crio-b27f9d7520aea5d225036fc6bec538eca7d1584f7c43aaab91ce8a953ddeb357 WatchSource:0}: Error finding container b27f9d7520aea5d225036fc6bec538eca7d1584f7c43aaab91ce8a953ddeb357: Status 404 returned error can't find the container with id b27f9d7520aea5d225036fc6bec538eca7d1584f7c43aaab91ce8a953ddeb357
Apr 17 16:32:38.579352 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.579280 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4jlzg"]
Apr 17 16:32:38.582319 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:38.582290 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28bd3d62_5065_4e51_a02d_686fb319fffc.slice/crio-6e45910878a21703ad4440a95408cbe5c36ed6ab3d05fd0beaca769e169a6782 WatchSource:0}: Error finding container 6e45910878a21703ad4440a95408cbe5c36ed6ab3d05fd0beaca769e169a6782: Status 404 returned error can't find the container with id 6e45910878a21703ad4440a95408cbe5c36ed6ab3d05fd0beaca769e169a6782
Apr 17 16:32:38.582699 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:38.582679 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wbbbm"]
Apr 17 16:32:38.585647 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:38.585624 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02278729_f9e5_4615_9be2_2f650d08b858.slice/crio-fa09987b051deee9afa180bb32ebe27abbabf9fca1ae1ac860c0c6fb14173601 WatchSource:0}: Error finding container fa09987b051deee9afa180bb32ebe27abbabf9fca1ae1ac860c0c6fb14173601: Status 404 returned error can't find the container with id fa09987b051deee9afa180bb32ebe27abbabf9fca1ae1ac860c0c6fb14173601
Apr 17 16:32:39.015634 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.015572 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7wqsv"]
Apr 17 16:32:39.034623 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.034582 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7wqsv"]
Apr 17 16:32:39.034796 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.034727 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv"
Apr 17 16:32:39.038987 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.038655 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-4mnhz\""
Apr 17 16:32:39.039133 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.039117 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 16:32:39.041915 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.039291 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 16:32:39.041915 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.039762 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 16:32:39.041915 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.040243
2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:32:39.041915 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.040596 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:32:39.222419 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.222376 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/538fa6d8-c9c5-4f08-b49c-55184d52040e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.222596 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.222444 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.222596 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.222531 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cqq6\" (UniqueName: \"kubernetes.io/projected/538fa6d8-c9c5-4f08-b49c-55184d52040e-kube-api-access-6cqq6\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.222596 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.222581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.260106 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.260065 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4jlzg" event={"ID":"28bd3d62-5065-4e51-a02d-686fb319fffc","Type":"ContainerStarted","Data":"6e45910878a21703ad4440a95408cbe5c36ed6ab3d05fd0beaca769e169a6782"} Apr 17 16:32:39.262050 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.262014 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbbbm" event={"ID":"02278729-f9e5-4615-9be2-2f650d08b858","Type":"ContainerStarted","Data":"ff5bbbeb872a4d3739890068d385383db7c291fd4ffad8ae33284b8e8a891193"} Apr 17 16:32:39.262194 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.262054 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbbbm" event={"ID":"02278729-f9e5-4615-9be2-2f650d08b858","Type":"ContainerStarted","Data":"fa09987b051deee9afa180bb32ebe27abbabf9fca1ae1ac860c0c6fb14173601"} Apr 17 16:32:39.263378 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.263349 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bhgll" event={"ID":"995d1a32-09a5-4100-bd91-c6ccdf96086d","Type":"ContainerStarted","Data":"b27f9d7520aea5d225036fc6bec538eca7d1584f7c43aaab91ce8a953ddeb357"} Apr 17 16:32:39.323671 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.323582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cqq6\" (UniqueName: \"kubernetes.io/projected/538fa6d8-c9c5-4f08-b49c-55184d52040e-kube-api-access-6cqq6\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: 
\"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.323671 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.323634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.323931 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.323696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/538fa6d8-c9c5-4f08-b49c-55184d52040e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.323931 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.323739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.323931 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:39.323859 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 16:32:39.323931 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:39.323924 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-tls podName:538fa6d8-c9c5-4f08-b49c-55184d52040e nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:39.823904071 +0000 UTC m=+57.346929633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-7wqsv" (UID: "538fa6d8-c9c5-4f08-b49c-55184d52040e") : secret "prometheus-operator-tls" not found Apr 17 16:32:39.324620 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.324589 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/538fa6d8-c9c5-4f08-b49c-55184d52040e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.326630 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.326602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.335557 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.335531 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cqq6\" (UniqueName: \"kubernetes.io/projected/538fa6d8-c9c5-4f08-b49c-55184d52040e-kube-api-access-6cqq6\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.829210 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.829173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.832050 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.832022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/538fa6d8-c9c5-4f08-b49c-55184d52040e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7wqsv\" (UID: \"538fa6d8-c9c5-4f08-b49c-55184d52040e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:39.948364 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:39.948323 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" Apr 17 16:32:41.213575 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:41.213549 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79ft9" Apr 17 16:32:41.405308 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:41.405276 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7wqsv"] Apr 17 16:32:41.418575 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:41.418544 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod538fa6d8_c9c5_4f08_b49c_55184d52040e.slice/crio-4c2114a60c1b7d97861c48c956d93d0fcbc1043ec3de4d2d7c5fe0cf4978504b WatchSource:0}: Error finding container 4c2114a60c1b7d97861c48c956d93d0fcbc1043ec3de4d2d7c5fe0cf4978504b: Status 404 returned error can't find the container with id 4c2114a60c1b7d97861c48c956d93d0fcbc1043ec3de4d2d7c5fe0cf4978504b Apr 17 16:32:42.271007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:42.270903 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ingress-canary/ingress-canary-4jlzg" event={"ID":"28bd3d62-5065-4e51-a02d-686fb319fffc","Type":"ContainerStarted","Data":"8260bfe8e703aeac44098431aa71110a08e96f6fd642a5e8dca58d949d2a8955"} Apr 17 16:32:42.272545 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:42.272515 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" event={"ID":"538fa6d8-c9c5-4f08-b49c-55184d52040e","Type":"ContainerStarted","Data":"4c2114a60c1b7d97861c48c956d93d0fcbc1043ec3de4d2d7c5fe0cf4978504b"} Apr 17 16:32:42.274280 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:42.274240 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbbbm" event={"ID":"02278729-f9e5-4615-9be2-2f650d08b858","Type":"ContainerStarted","Data":"2028d8c9b181315a1373b5e92fbe6479748aefc1b77759cff0492b8d0859504c"} Apr 17 16:32:42.275897 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:42.275874 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bhgll" event={"ID":"995d1a32-09a5-4100-bd91-c6ccdf96086d","Type":"ContainerStarted","Data":"df151149192b40f7d97f48b2626351b9259c368556ee78967f780a0508568989"} Apr 17 16:32:42.275989 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:42.275906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bhgll" event={"ID":"995d1a32-09a5-4100-bd91-c6ccdf96086d","Type":"ContainerStarted","Data":"6c3a6aecb65a5ba59eef2d204d483c467dbe6d44dc6756a6d7c4c8d606d63312"} Apr 17 16:32:42.276074 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:42.276055 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bhgll" Apr 17 16:32:42.284813 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:42.284758 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4jlzg" 
podStartSLOduration=2.603448415 podStartE2EDuration="5.284744241s" podCreationTimestamp="2026-04-17 16:32:37 +0000 UTC" firstStartedPulling="2026-04-17 16:32:38.584652958 +0000 UTC m=+56.107678522" lastFinishedPulling="2026-04-17 16:32:41.265948778 +0000 UTC m=+58.788974348" observedRunningTime="2026-04-17 16:32:42.284735311 +0000 UTC m=+59.807760897" watchObservedRunningTime="2026-04-17 16:32:42.284744241 +0000 UTC m=+59.807769826" Apr 17 16:32:42.307946 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:42.307826 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bhgll" podStartSLOduration=2.4096331539999998 podStartE2EDuration="5.307794875s" podCreationTimestamp="2026-04-17 16:32:37 +0000 UTC" firstStartedPulling="2026-04-17 16:32:38.367793005 +0000 UTC m=+55.890818571" lastFinishedPulling="2026-04-17 16:32:41.265954713 +0000 UTC m=+58.788980292" observedRunningTime="2026-04-17 16:32:42.301072843 +0000 UTC m=+59.824098429" watchObservedRunningTime="2026-04-17 16:32:42.307794875 +0000 UTC m=+59.830820459" Apr 17 16:32:44.281808 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:44.281768 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" event={"ID":"538fa6d8-c9c5-4f08-b49c-55184d52040e","Type":"ContainerStarted","Data":"2cb6eeb7e863991de84bbb589799c707075a0e36581646939416981be0d9ead4"} Apr 17 16:32:44.281808 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:44.281812 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" event={"ID":"538fa6d8-c9c5-4f08-b49c-55184d52040e","Type":"ContainerStarted","Data":"2f72302e46b00957bbe5037bad4e326163a7f319456519e460aedb1c5a5a740e"} Apr 17 16:32:44.283445 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:44.283420 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbbbm" 
event={"ID":"02278729-f9e5-4615-9be2-2f650d08b858","Type":"ContainerStarted","Data":"1f7575483d7d209108a560bfb706fd01c72d2d357f7b7e4ca8dba45d712e2c6d"} Apr 17 16:32:44.298930 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:44.298887 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-7wqsv" podStartSLOduration=2.959516563 podStartE2EDuration="5.29887492s" podCreationTimestamp="2026-04-17 16:32:39 +0000 UTC" firstStartedPulling="2026-04-17 16:32:41.420579829 +0000 UTC m=+58.943605392" lastFinishedPulling="2026-04-17 16:32:43.759938183 +0000 UTC m=+61.282963749" observedRunningTime="2026-04-17 16:32:44.297933442 +0000 UTC m=+61.820959025" watchObservedRunningTime="2026-04-17 16:32:44.29887492 +0000 UTC m=+61.821900503" Apr 17 16:32:44.314240 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:44.314197 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wbbbm" podStartSLOduration=2.282573778 podStartE2EDuration="7.31418506s" podCreationTimestamp="2026-04-17 16:32:37 +0000 UTC" firstStartedPulling="2026-04-17 16:32:38.725191801 +0000 UTC m=+56.248217364" lastFinishedPulling="2026-04-17 16:32:43.756803085 +0000 UTC m=+61.279828646" observedRunningTime="2026-04-17 16:32:44.313709186 +0000 UTC m=+61.836734770" watchObservedRunningTime="2026-04-17 16:32:44.31418506 +0000 UTC m=+61.837210681" Apr 17 16:32:46.355368 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.355324 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"] Apr 17 16:32:46.358285 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.358270 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" Apr 17 16:32:46.361308 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.361287 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 16:32:46.361435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.361308 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 16:32:46.361551 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.361538 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-q22x2\"" Apr 17 16:32:46.368140 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.368117 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"] Apr 17 16:32:46.370745 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.370726 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"] Apr 17 16:32:46.373622 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.373605 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" Apr 17 16:32:46.376175 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.376154 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 16:32:46.376296 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.376177 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 16:32:46.376296 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.376191 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 16:32:46.376581 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.376566 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4vx5x\"" Apr 17 16:32:46.378211 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378188 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m2ss\" (UniqueName: \"kubernetes.io/projected/f4baaf28-021b-4c3a-bf16-ff044a443f99-kube-api-access-4m2ss\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" Apr 17 16:32:46.378340 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378219 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4baaf28-021b-4c3a-bf16-ff044a443f99-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" Apr 17 16:32:46.378340 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:32:46.378244 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b16f4d3-198e-4627-a661-0da1d8f90ee9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" Apr 17 16:32:46.378453 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378408 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" Apr 17 16:32:46.378453 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378444 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" Apr 17 16:32:46.378554 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378503 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" Apr 17 16:32:46.378554 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378534 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" Apr 17 16:32:46.378659 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378559 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjpj9\" (UniqueName: \"kubernetes.io/projected/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-api-access-cjpj9\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" Apr 17 16:32:46.378659 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378620 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0b16f4d3-198e-4627-a661-0da1d8f90ee9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" Apr 17 16:32:46.378659 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.378646 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" Apr 17 16:32:46.385338 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.385316 2569 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"] Apr 17 16:32:46.387765 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.387744 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ssfqs"] Apr 17 16:32:46.390675 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.390657 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ssfqs" Apr 17 16:32:46.393104 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.393088 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:32:46.393281 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.393264 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:32:46.393366 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.393342 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pjgf8\"" Apr 17 16:32:46.393366 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.393352 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:32:46.479891 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.479860 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b16f4d3-198e-4627-a661-0da1d8f90ee9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" Apr 17 16:32:46.480075 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.479907 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.480075 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.479932 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-sys\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480075 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.479981 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-wtmp\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480075 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-textfile\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480075 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4baaf28-021b-4c3a-bf16-ff044a443f99-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.480322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480088 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-root\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480130 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480155 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fhmm\" (UniqueName: \"kubernetes.io/projected/1eb03b6e-f205-4b1e-976b-27236f5e9e47-kube-api-access-9fhmm\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480183 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.480322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjpj9\" (UniqueName: \"kubernetes.io/projected/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-api-access-cjpj9\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.480322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.480322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.480684 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0b16f4d3-198e-4627-a661-0da1d8f90ee9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.480684 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-tls\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480684 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1eb03b6e-f205-4b1e-976b-27236f5e9e47-metrics-client-ca\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480684 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.480684 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-accelerators-collector-config\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.480684 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m2ss\" (UniqueName: \"kubernetes.io/projected/f4baaf28-021b-4c3a-bf16-ff044a443f99-kube-api-access-4m2ss\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.480684 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:46.480545 2569 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 17 16:32:46.480684 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:46.480628 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-tls podName:0b16f4d3-198e-4627-a661-0da1d8f90ee9 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:46.980608977 +0000 UTC m=+64.503634553 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-xw5t5" (UID: "0b16f4d3-198e-4627-a661-0da1d8f90ee9") : secret "kube-state-metrics-tls" not found
Apr 17 16:32:46.481034 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b16f4d3-198e-4627-a661-0da1d8f90ee9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.481034 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4baaf28-021b-4c3a-bf16-ff044a443f99-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.481034 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480773 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.481034 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.480856 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0b16f4d3-198e-4627-a661-0da1d8f90ee9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.481034 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:46.480428 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 17 16:32:46.481034 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:46.480952 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-tls podName:f4baaf28-021b-4c3a-bf16-ff044a443f99 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:46.980933822 +0000 UTC m=+64.503959398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-svlf8" (UID: "f4baaf28-021b-4c3a-bf16-ff044a443f99") : secret "openshift-state-metrics-tls" not found
Apr 17 16:32:46.482735 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.482709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.482895 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.482879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.488982 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.488954 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjpj9\" (UniqueName: \"kubernetes.io/projected/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-api-access-cjpj9\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.489493 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.489471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m2ss\" (UniqueName: \"kubernetes.io/projected/f4baaf28-021b-4c3a-bf16-ff044a443f99-kube-api-access-4m2ss\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.581703 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581665 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-sys\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.581881 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-wtmp\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.581881 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-textfile\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.581881 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581772 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-root\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.581881 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581774 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-sys\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.581881 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581871 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-root\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fhmm\" (UniqueName: \"kubernetes.io/projected/1eb03b6e-f205-4b1e-976b-27236f5e9e47-kube-api-access-9fhmm\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.581922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-wtmp\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.582026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-tls\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.582060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1eb03b6e-f205-4b1e-976b-27236f5e9e47-metrics-client-ca\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.582096 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-accelerators-collector-config\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.582133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-textfile\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582497 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:46.582150 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 16:32:46.582497 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:46.582224 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-tls podName:1eb03b6e-f205-4b1e-976b-27236f5e9e47 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:47.082203222 +0000 UTC m=+64.605228785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-tls") pod "node-exporter-ssfqs" (UID: "1eb03b6e-f205-4b1e-976b-27236f5e9e47") : secret "node-exporter-tls" not found
Apr 17 16:32:46.582716 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.582692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1eb03b6e-f205-4b1e-976b-27236f5e9e47-metrics-client-ca\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.582844 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.582697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-accelerators-collector-config\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.584545 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.584518 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.593509 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.593476 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fhmm\" (UniqueName: \"kubernetes.io/projected/1eb03b6e-f205-4b1e-976b-27236f5e9e47-kube-api-access-9fhmm\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:46.986190 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.986147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.986190 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.986192 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:46.988597 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.988572 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f4baaf28-021b-4c3a-bf16-ff044a443f99-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-svlf8\" (UID: \"f4baaf28-021b-4c3a-bf16-ff044a443f99\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:46.988719 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:46.988632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b16f4d3-198e-4627-a661-0da1d8f90ee9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xw5t5\" (UID: \"0b16f4d3-198e-4627-a661-0da1d8f90ee9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:47.086517 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.086487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-tls\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:47.088688 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.088670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1eb03b6e-f205-4b1e-976b-27236f5e9e47-node-exporter-tls\") pod \"node-exporter-ssfqs\" (UID: \"1eb03b6e-f205-4b1e-976b-27236f5e9e47\") " pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:47.266871 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.266772 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"
Apr 17 16:32:47.281766 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.281731 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"
Apr 17 16:32:47.298848 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.298814 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ssfqs"
Apr 17 16:32:47.397116 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.396931 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8"]
Apr 17 16:32:47.415855 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.415831 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xw5t5"]
Apr 17 16:32:47.418690 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:47.418661 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b16f4d3_198e_4627_a661_0da1d8f90ee9.slice/crio-d31246ebcd5acf318e62cb6dad2525b066e994375cdac2c276e439da51949cbd WatchSource:0}: Error finding container d31246ebcd5acf318e62cb6dad2525b066e994375cdac2c276e439da51949cbd: Status 404 returned error can't find the container with id d31246ebcd5acf318e62cb6dad2525b066e994375cdac2c276e439da51949cbd
Apr 17 16:32:47.434501 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.434479 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:32:47.439084 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.439066 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.441817 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.441795 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 16:32:47.441921 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.441826 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 16:32:47.441978 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.441795 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 16:32:47.442027 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.441826 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 16:32:47.442080 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.441795 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 16:32:47.442080 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.442046 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 16:32:47.442734 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.442720 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 16:32:47.442778 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.442744 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 16:32:47.442819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.442809 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 16:32:47.443151 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.442835 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-46tcd\""
Apr 17 16:32:47.461093 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.461072 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:32:47.488612 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.488573 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-out\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.488721 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.488622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.488721 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.488710 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.488836 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.488752 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.488836 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.488810 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-volume\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.488921 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.488856 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.488921 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.488906 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68d65\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-kube-api-access-68d65\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.489066 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.488935 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.489066 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.489062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.489171 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.489091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-web-config\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.489171 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.489117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.489171 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.489144 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.489480 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.489172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.589832 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.589794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.589998 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.589843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-volume\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.589998 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.589870 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.589998 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.589905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68d65\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-kube-api-access-68d65\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.589998 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.589952 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.590171 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:32:47.590002 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle podName:b146f1f0-bf94-4d1a-9a88-bd721fe8e564 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:48.089975383 +0000 UTC m=+65.613000962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564") : configmap references non-existent config key: ca-bundle.crt
Apr 17 16:32:47.590171 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590072 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.590171 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590103 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-web-config\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.590171 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590130 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.590171 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.590475 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.590475 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-out\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.590577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590476 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.590577 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.591290 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.590969 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.591789 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.591760 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.592966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.592941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-volume\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:32:47.593266 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.593200 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:47.593564 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.593533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:47.593675 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.593654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:47.593737 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.593660 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-out\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:47.593942 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.593918 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:47.594036 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.593990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-web-config\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:47.594271 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.594234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:47.595209 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.595187 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:47.598514 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:47.598493 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68d65\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-kube-api-access-68d65\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:48.094946 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.094902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:48.095945 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.095918 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:48.297679 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.297635 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ssfqs" event={"ID":"1eb03b6e-f205-4b1e-976b-27236f5e9e47","Type":"ContainerStarted","Data":"56d61060fad9ad6df17e2814bf30a7f5e2360f841c1634270a7ecfe8bca03d6f"} Apr 17 16:32:48.297804 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.297693 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ssfqs" event={"ID":"1eb03b6e-f205-4b1e-976b-27236f5e9e47","Type":"ContainerStarted","Data":"dde8d4344ca20455a958e52685155471c5545c22bb9857d8038ca327cbe3ff42"} Apr 17 16:32:48.299054 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.299018 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" event={"ID":"0b16f4d3-198e-4627-a661-0da1d8f90ee9","Type":"ContainerStarted","Data":"d31246ebcd5acf318e62cb6dad2525b066e994375cdac2c276e439da51949cbd"} Apr 17 16:32:48.300983 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.300954 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" event={"ID":"f4baaf28-021b-4c3a-bf16-ff044a443f99","Type":"ContainerStarted","Data":"63a435bd23d191329215005510af957b150c88415126dab7b63877639c9390ae"} Apr 17 16:32:48.300983 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.300989 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" event={"ID":"f4baaf28-021b-4c3a-bf16-ff044a443f99","Type":"ContainerStarted","Data":"ee3a5fd0f7131142c3d4140060e34a61cf0cdb408badb90347eef405ac6b5945"} Apr 17 16:32:48.301134 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:32:48.301002 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" event={"ID":"f4baaf28-021b-4c3a-bf16-ff044a443f99","Type":"ContainerStarted","Data":"68662ba342331e708b7e7f20a2af4a6cd924cdc194541e2b19231920b376b233"} Apr 17 16:32:48.330652 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.330622 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-85f4f855f9-2dwwm"] Apr 17 16:32:48.334508 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.334487 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.337141 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.337116 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 16:32:48.337271 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.337153 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 16:32:48.337271 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.337153 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-d7jrr5uaanp84\"" Apr 17 16:32:48.337388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.337271 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-7n8jr\"" Apr 17 16:32:48.337540 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.337523 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 16:32:48.337625 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.337590 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 16:32:48.337703 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.337691 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 16:32:48.345228 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.345167 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-85f4f855f9-2dwwm"] Apr 17 16:32:48.354131 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.354111 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:32:48.397614 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.397581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.397966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.397624 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-tls\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.397966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.397655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t695r\" (UniqueName: \"kubernetes.io/projected/528a342b-53f1-4c6d-a19d-b68a3684d7d0-kube-api-access-t695r\") pod 
\"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.397966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.397676 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.397966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.397692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.397966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.397771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/528a342b-53f1-4c6d-a19d-b68a3684d7d0-metrics-client-ca\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.397966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.397921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: 
\"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.397966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.397951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-grpc-tls\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.498973 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.498938 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.499114 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.498993 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-grpc-tls\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.499114 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.499045 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.499114 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:32:48.499073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-tls\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.499114 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.499098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t695r\" (UniqueName: \"kubernetes.io/projected/528a342b-53f1-4c6d-a19d-b68a3684d7d0-kube-api-access-t695r\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.499325 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.499131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.499325 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.499157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.499325 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.499212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/528a342b-53f1-4c6d-a19d-b68a3684d7d0-metrics-client-ca\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.499959 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.499909 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/528a342b-53f1-4c6d-a19d-b68a3684d7d0-metrics-client-ca\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.501883 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.501860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.502367 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.502336 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.502539 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.502465 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-grpc-tls\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " 
pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.503040 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.502658 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.503040 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.502984 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.503040 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.503027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/528a342b-53f1-4c6d-a19d-b68a3684d7d0-secret-thanos-querier-tls\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.507695 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.507675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t695r\" (UniqueName: \"kubernetes.io/projected/528a342b-53f1-4c6d-a19d-b68a3684d7d0-kube-api-access-t695r\") pod \"thanos-querier-85f4f855f9-2dwwm\" (UID: \"528a342b-53f1-4c6d-a19d-b68a3684d7d0\") " pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.645115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.645075 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:48.701306 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.701264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:48.704102 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.704073 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:32:48.714595 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.714561 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4666c56f-3d86-4e16-a782-6a41f0fe8825-metrics-certs\") pod \"network-metrics-daemon-vtq9t\" (UID: \"4666c56f-3d86-4e16-a782-6a41f0fe8825\") " pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:48.852964 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.852932 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6m66z\"" Apr 17 16:32:48.860859 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.860838 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vtq9t" Apr 17 16:32:48.902983 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.902892 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:48.905690 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.905469 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:32:48.916382 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.916121 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:32:48.927667 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:48.927604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn6cw\" (UniqueName: \"kubernetes.io/projected/290ef757-149c-497a-85e3-cc6a8cd8fc45-kube-api-access-sn6cw\") pod \"network-check-target-hdwf7\" (UID: \"290ef757-149c-497a-85e3-cc6a8cd8fc45\") " pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:49.062941 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.062911 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:32:49.090380 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.090354 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-85f4f855f9-2dwwm"] Apr 17 16:32:49.095904 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:49.095880 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528a342b_53f1_4c6d_a19d_b68a3684d7d0.slice/crio-58a809b8fd0c59a77efefd8dd90545b9fd15d648d8fbaf95b544a1d088abf224 WatchSource:0}: Error finding container 58a809b8fd0c59a77efefd8dd90545b9fd15d648d8fbaf95b544a1d088abf224: Status 404 returned error can't find the container with id 58a809b8fd0c59a77efefd8dd90545b9fd15d648d8fbaf95b544a1d088abf224 Apr 17 16:32:49.106334 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.106294 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vtq9t"] Apr 17 16:32:49.148804 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.148775 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fwf7s\"" Apr 17 16:32:49.156818 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.156757 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:49.293635 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.293521 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hdwf7"] Apr 17 16:32:49.297387 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:49.297362 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod290ef757_149c_497a_85e3_cc6a8cd8fc45.slice/crio-96aa75a44a6d2c37bb41e1cdcfbd7962b2464a071f80a1f41d7f0894f2c49c50 WatchSource:0}: Error finding container 96aa75a44a6d2c37bb41e1cdcfbd7962b2464a071f80a1f41d7f0894f2c49c50: Status 404 returned error can't find the container with id 96aa75a44a6d2c37bb41e1cdcfbd7962b2464a071f80a1f41d7f0894f2c49c50 Apr 17 16:32:49.305331 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.305307 2569 generic.go:358] "Generic (PLEG): container finished" podID="1eb03b6e-f205-4b1e-976b-27236f5e9e47" 
containerID="56d61060fad9ad6df17e2814bf30a7f5e2360f841c1634270a7ecfe8bca03d6f" exitCode=0 Apr 17 16:32:49.305432 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.305387 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ssfqs" event={"ID":"1eb03b6e-f205-4b1e-976b-27236f5e9e47","Type":"ContainerDied","Data":"56d61060fad9ad6df17e2814bf30a7f5e2360f841c1634270a7ecfe8bca03d6f"} Apr 17 16:32:49.307808 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.307775 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" event={"ID":"0b16f4d3-198e-4627-a661-0da1d8f90ee9","Type":"ContainerStarted","Data":"73aa38b878ea14f9dc274a9359be8b3ada98e4f4509c5ccd81d24af453150944"} Apr 17 16:32:49.307901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.307810 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" event={"ID":"0b16f4d3-198e-4627-a661-0da1d8f90ee9","Type":"ContainerStarted","Data":"8edfc9b1e36c43b5758f4c8ef3598469d33b31f424c9a2a131f28b2796c0fba4"} Apr 17 16:32:49.307901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.307823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" event={"ID":"0b16f4d3-198e-4627-a661-0da1d8f90ee9","Type":"ContainerStarted","Data":"bbd3c37c5783f71ac4648673b1e553d41797c96f6eade5faab6f91fc6a30c8f6"} Apr 17 16:32:49.309931 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.309913 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" event={"ID":"f4baaf28-021b-4c3a-bf16-ff044a443f99","Type":"ContainerStarted","Data":"f2dd53b65c5211aa33dde0f08a3ac653ed2971575a0c6c20b37b918b2e6c5ab8"} Apr 17 16:32:49.311356 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.311323 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-hdwf7" event={"ID":"290ef757-149c-497a-85e3-cc6a8cd8fc45","Type":"ContainerStarted","Data":"96aa75a44a6d2c37bb41e1cdcfbd7962b2464a071f80a1f41d7f0894f2c49c50"} Apr 17 16:32:49.312529 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.312498 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vtq9t" event={"ID":"4666c56f-3d86-4e16-a782-6a41f0fe8825","Type":"ContainerStarted","Data":"07343a9752e9606a8f908171b71760da8601f24516f749e4ea6b4c951bcc43ca"} Apr 17 16:32:49.313621 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.313597 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerStarted","Data":"0857aa699aed96557e009c6c980c943006b38ec6d5f80e46a6cefb9a139c9d6e"} Apr 17 16:32:49.314661 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.314643 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" event={"ID":"528a342b-53f1-4c6d-a19d-b68a3684d7d0","Type":"ContainerStarted","Data":"58a809b8fd0c59a77efefd8dd90545b9fd15d648d8fbaf95b544a1d088abf224"} Apr 17 16:32:49.349969 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.349840 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-svlf8" podStartSLOduration=2.000024681 podStartE2EDuration="3.349821413s" podCreationTimestamp="2026-04-17 16:32:46 +0000 UTC" firstStartedPulling="2026-04-17 16:32:47.542069761 +0000 UTC m=+65.065095323" lastFinishedPulling="2026-04-17 16:32:48.891866493 +0000 UTC m=+66.414892055" observedRunningTime="2026-04-17 16:32:49.349170248 +0000 UTC m=+66.872195833" watchObservedRunningTime="2026-04-17 16:32:49.349821413 +0000 UTC m=+66.872846999" Apr 17 16:32:49.376690 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:49.376624 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xw5t5" podStartSLOduration=1.903748354 podStartE2EDuration="3.376604s" podCreationTimestamp="2026-04-17 16:32:46 +0000 UTC" firstStartedPulling="2026-04-17 16:32:47.420551533 +0000 UTC m=+64.943577099" lastFinishedPulling="2026-04-17 16:32:48.893407171 +0000 UTC m=+66.416432745" observedRunningTime="2026-04-17 16:32:49.374882521 +0000 UTC m=+66.897908110" watchObservedRunningTime="2026-04-17 16:32:49.376604 +0000 UTC m=+66.899629796" Apr 17 16:32:50.321175 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:50.321065 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ssfqs" event={"ID":"1eb03b6e-f205-4b1e-976b-27236f5e9e47","Type":"ContainerStarted","Data":"6a34e995a1018061a203038ea7c5c193d99cbbe7bb4b32b121cd17820494792c"} Apr 17 16:32:50.321175 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:50.321116 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ssfqs" event={"ID":"1eb03b6e-f205-4b1e-976b-27236f5e9e47","Type":"ContainerStarted","Data":"56466c7108cb595d26ba7aebf3a2a415a0a5fb4b6840e7c2e1486b171e212dd1"} Apr 17 16:32:50.343885 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:50.343639 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ssfqs" podStartSLOduration=3.445112335 podStartE2EDuration="4.343618435s" podCreationTimestamp="2026-04-17 16:32:46 +0000 UTC" firstStartedPulling="2026-04-17 16:32:47.31114761 +0000 UTC m=+64.834173186" lastFinishedPulling="2026-04-17 16:32:48.209653724 +0000 UTC m=+65.732679286" observedRunningTime="2026-04-17 16:32:50.343244366 +0000 UTC m=+67.866270015" watchObservedRunningTime="2026-04-17 16:32:50.343618435 +0000 UTC m=+67.866644021" Apr 17 16:32:51.150333 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.150291 2569 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-console/console-65dd5ccdb8-mdkmx"] Apr 17 16:32:51.153998 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.153970 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.157306 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.157064 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:32:51.157306 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.157092 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:32:51.157306 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.157070 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:32:51.158650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.158447 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4nh7g\"" Apr 17 16:32:51.158650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.158487 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 16:32:51.158650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.158512 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 16:32:51.158650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.158446 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 16:32:51.158934 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.158673 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:32:51.164516 ip-10-0-135-127 kubenswrapper[2569]: 
I0417 16:32:51.164496 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 16:32:51.168323 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.168300 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65dd5ccdb8-mdkmx"] Apr 17 16:32:51.223657 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.223616 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt2fp\" (UniqueName: \"kubernetes.io/projected/052b105d-312b-43e7-8b4b-e38f8fc75abf-kube-api-access-wt2fp\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.223657 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.223662 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-trusted-ca-bundle\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.223879 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.223771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-config\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.223879 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.223826 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-service-ca\") pod \"console-65dd5ccdb8-mdkmx\" (UID: 
\"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.224002 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.223885 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-serving-cert\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.224002 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.223929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-oauth-serving-cert\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.224002 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.223966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-oauth-config\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.324652 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.324619 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-trusted-ca-bundle\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.325089 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.324686 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-config\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.325089 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.324711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-service-ca\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.325089 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.324745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-serving-cert\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.325089 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.324784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-oauth-serving-cert\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.325089 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.324814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-oauth-config\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.325089 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.324937 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wt2fp\" (UniqueName: \"kubernetes.io/projected/052b105d-312b-43e7-8b4b-e38f8fc75abf-kube-api-access-wt2fp\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.325749 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.325697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-trusted-ca-bundle\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.326075 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.326054 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-config\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.326528 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.326507 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-service-ca\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.326625 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.326548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-oauth-serving-cert\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.328587 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.328563 
2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-oauth-config\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.328920 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.328901 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-serving-cert\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.334299 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.334276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt2fp\" (UniqueName: \"kubernetes.io/projected/052b105d-312b-43e7-8b4b-e38f8fc75abf-kube-api-access-wt2fp\") pod \"console-65dd5ccdb8-mdkmx\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") " pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.468507 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.468470 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:32:51.609309 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:51.609228 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65dd5ccdb8-mdkmx"] Apr 17 16:32:52.280670 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.280640 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bhgll" Apr 17 16:32:52.329670 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.329632 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" event={"ID":"528a342b-53f1-4c6d-a19d-b68a3684d7d0","Type":"ContainerStarted","Data":"592bd5d5974d4ae3fa50a2478fd0011e8fed8358760ae57f4cff86fc8635ce2d"} Apr 17 16:32:52.330107 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.329680 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" event={"ID":"528a342b-53f1-4c6d-a19d-b68a3684d7d0","Type":"ContainerStarted","Data":"b309e2749b9083daf30d041da05a2dd04c76a81fb9f206da35d543df05440589"} Apr 17 16:32:52.331370 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.331337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vtq9t" event={"ID":"4666c56f-3d86-4e16-a782-6a41f0fe8825","Type":"ContainerStarted","Data":"39c671781dca50c1abcc8b423f5b024035259f7f7a4fc03026d054716eb0b503"} Apr 17 16:32:52.333221 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.333177 2569 generic.go:358] "Generic (PLEG): container finished" podID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerID="557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f" exitCode=0 Apr 17 16:32:52.333379 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.333328 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerDied","Data":"557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f"} Apr 17 16:32:52.584831 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.584740 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:32:52.594883 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.594502 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.597875 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.597643 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 16:32:52.599894 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.599313 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 16:32:52.599894 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.599532 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 16:32:52.599894 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.599753 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 16:32:52.600180 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600008 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 16:32:52.600180 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600056 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 16:32:52.600180 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600089 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 16:32:52.600180 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600189 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jpnr9\"" Apr 17 16:32:52.600434 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600327 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 16:32:52.600511 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600456 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 16:32:52.600596 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600577 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 16:32:52.600682 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600647 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-dm68br4pi30vp\"" Apr 17 16:32:52.600682 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600669 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 16:32:52.600868 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.600690 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:32:52.602144 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.602122 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 16:32:52.608380 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.607627 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 
16:32:52.613494 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:52.613468 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod052b105d_312b_43e7_8b4b_e38f8fc75abf.slice/crio-5c7a9c52f45e48643383fa2ba6260cec7762bef97d63a29efcb7d7a71f2f730e WatchSource:0}: Error finding container 5c7a9c52f45e48643383fa2ba6260cec7762bef97d63a29efcb7d7a71f2f730e: Status 404 returned error can't find the container with id 5c7a9c52f45e48643383fa2ba6260cec7762bef97d63a29efcb7d7a71f2f730e Apr 17 16:32:52.636765 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.636737 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-config\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.636899 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.636826 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.636972 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.636904 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.636972 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.636957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637070 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.636986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637070 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637149 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637149 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637118 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637235 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637195 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mgkn\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-kube-api-access-4mgkn\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637311 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637311 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637305 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637410 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637327 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637410 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637355 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637498 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637475 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637579 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637550 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637654 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637639 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-config-out\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637725 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637708 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.637778 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.637739 
2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.739179 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.739044 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mgkn\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-kube-api-access-4mgkn\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.739179 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.739114 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.739179 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.739145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.739492 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.739183 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.739300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.739964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.739968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.739992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740031 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740118 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-config\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.740276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740235 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.741462 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.740954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.741462 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.741008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.741462 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.741039 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.741462 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.741429 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.742848 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.742513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.743948 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.743600 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.743948 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.743900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
17 16:32:52.747588 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.744538 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.747588 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.744863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.747588 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.745479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.747588 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.746829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.747855 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.747618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.748184 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.748135 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.748873 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.748827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-config-out\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.749119 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.748972 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-config\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.749119 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.749057 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.749275 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.749149 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.749275 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.749225 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.750071 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.750051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mgkn\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-kube-api-access-4mgkn\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.750787 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.750760 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:52.910944 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:52.910899 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:32:53.060394 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.060365 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:32:53.063004 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:32:53.062969 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a6d1365_1408_4246_b172_881b465eddcc.slice/crio-cda95809aac2d0f1a33ff643c9fb99c947b2e3c692245dba63aa010bf946e38e WatchSource:0}: Error finding container cda95809aac2d0f1a33ff643c9fb99c947b2e3c692245dba63aa010bf946e38e: Status 404 returned error can't find the container with id cda95809aac2d0f1a33ff643c9fb99c947b2e3c692245dba63aa010bf946e38e Apr 17 16:32:53.341640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.341596 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hdwf7" event={"ID":"290ef757-149c-497a-85e3-cc6a8cd8fc45","Type":"ContainerStarted","Data":"cebb35c649fe9cc9b28506bf8d06d219e656ea08afae36ca8e46da807be54b3a"} Apr 17 16:32:53.342063 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.341836 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:32:53.343383 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.343352 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65dd5ccdb8-mdkmx" event={"ID":"052b105d-312b-43e7-8b4b-e38f8fc75abf","Type":"ContainerStarted","Data":"5c7a9c52f45e48643383fa2ba6260cec7762bef97d63a29efcb7d7a71f2f730e"} Apr 17 16:32:53.345790 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.345403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vtq9t" 
event={"ID":"4666c56f-3d86-4e16-a782-6a41f0fe8825","Type":"ContainerStarted","Data":"99c5c0adb364893b725fa8fd82de8e2b18e7714c65020767ff7f541008c8d120"} Apr 17 16:32:53.347391 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.347364 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a6d1365-1408-4246-b172-881b465eddcc" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" exitCode=0 Apr 17 16:32:53.347536 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.347513 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerDied","Data":"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5"} Apr 17 16:32:53.347606 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.347546 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerStarted","Data":"cda95809aac2d0f1a33ff643c9fb99c947b2e3c692245dba63aa010bf946e38e"} Apr 17 16:32:53.350224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.350191 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" event={"ID":"528a342b-53f1-4c6d-a19d-b68a3684d7d0","Type":"ContainerStarted","Data":"092426b52e2b45378e81c321fc2bb8d8b9bf3b75fac11c735255a2704b794b95"} Apr 17 16:32:53.359113 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.358972 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hdwf7" podStartSLOduration=66.974209555 podStartE2EDuration="1m10.358957077s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:32:49.299363267 +0000 UTC m=+66.822388832" lastFinishedPulling="2026-04-17 16:32:52.684110788 +0000 UTC m=+70.207136354" observedRunningTime="2026-04-17 16:32:53.357161001 +0000 UTC 
m=+70.880186586" watchObservedRunningTime="2026-04-17 16:32:53.358957077 +0000 UTC m=+70.881982667" Apr 17 16:32:53.404441 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:53.402864 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vtq9t" podStartSLOduration=68.187841015 podStartE2EDuration="1m10.402846528s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:32:49.113381772 +0000 UTC m=+66.636407335" lastFinishedPulling="2026-04-17 16:32:51.328387284 +0000 UTC m=+68.851412848" observedRunningTime="2026-04-17 16:32:53.402453136 +0000 UTC m=+70.925478717" watchObservedRunningTime="2026-04-17 16:32:53.402846528 +0000 UTC m=+70.925872113" Apr 17 16:32:54.357178 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:54.357140 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" event={"ID":"528a342b-53f1-4c6d-a19d-b68a3684d7d0","Type":"ContainerStarted","Data":"a40fbec0a199c8e6c611548752b8e949be8cb16580c99b77a629887bce67ac59"} Apr 17 16:32:54.357692 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:54.357182 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" event={"ID":"528a342b-53f1-4c6d-a19d-b68a3684d7d0","Type":"ContainerStarted","Data":"c92a8eb10596c5085f769a883816d818a60cf819ddd7489ee55354db93d93f61"} Apr 17 16:32:54.357692 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:54.357198 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" event={"ID":"528a342b-53f1-4c6d-a19d-b68a3684d7d0","Type":"ContainerStarted","Data":"da0032138342f65700c4209a23adb809b1728e2ee2b36cd8a62a2e57271817c9"} Apr 17 16:32:54.357692 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:54.357215 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:32:54.384615 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:54.384547 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" podStartSLOduration=1.774739525 podStartE2EDuration="6.384499762s" podCreationTimestamp="2026-04-17 16:32:48 +0000 UTC" firstStartedPulling="2026-04-17 16:32:49.098942353 +0000 UTC m=+66.621967930" lastFinishedPulling="2026-04-17 16:32:53.708702605 +0000 UTC m=+71.231728167" observedRunningTime="2026-04-17 16:32:54.38302664 +0000 UTC m=+71.906052223" watchObservedRunningTime="2026-04-17 16:32:54.384499762 +0000 UTC m=+71.907525347" Apr 17 16:32:55.362617 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:55.362579 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerStarted","Data":"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487"} Apr 17 16:32:56.369187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:56.368982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerStarted","Data":"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe"} Apr 17 16:32:56.369187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:56.369031 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerStarted","Data":"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8"} Apr 17 16:32:56.369187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:56.369046 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerStarted","Data":"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af"} Apr 17 16:32:56.369187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:56.369058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerStarted","Data":"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257"} Apr 17 16:32:56.369187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:56.369070 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerStarted","Data":"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763"} Apr 17 16:32:56.370439 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:56.370406 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65dd5ccdb8-mdkmx" event={"ID":"052b105d-312b-43e7-8b4b-e38f8fc75abf","Type":"ContainerStarted","Data":"7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d"} Apr 17 16:32:56.399122 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:56.399054 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.167087574 podStartE2EDuration="9.399033446s" podCreationTimestamp="2026-04-17 16:32:47 +0000 UTC" firstStartedPulling="2026-04-17 16:32:49.075627549 +0000 UTC m=+66.598653122" lastFinishedPulling="2026-04-17 16:32:54.307573408 +0000 UTC m=+71.830598994" observedRunningTime="2026-04-17 16:32:56.397532297 +0000 UTC m=+73.920557907" watchObservedRunningTime="2026-04-17 16:32:56.399033446 +0000 UTC m=+73.922059066" Apr 17 16:32:56.424568 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:56.424485 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-65dd5ccdb8-mdkmx" podStartSLOduration=2.51230949 podStartE2EDuration="5.424466161s" podCreationTimestamp="2026-04-17 16:32:51 +0000 UTC" firstStartedPulling="2026-04-17 16:32:52.617977493 +0000 UTC m=+70.141003058" lastFinishedPulling="2026-04-17 16:32:55.530134156 +0000 UTC m=+73.053159729" observedRunningTime="2026-04-17 16:32:56.423051856 +0000 UTC m=+73.946077454" watchObservedRunningTime="2026-04-17 16:32:56.424466161 +0000 UTC m=+73.947491745" Apr 17 16:32:58.380135 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:58.380102 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerStarted","Data":"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05"} Apr 17 16:32:58.380135 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:58.380138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerStarted","Data":"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a"} Apr 17 16:32:58.380638 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:58.380148 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerStarted","Data":"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8"} Apr 17 16:32:58.380638 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:58.380157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerStarted","Data":"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972"} Apr 17 16:32:58.380638 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:58.380165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerStarted","Data":"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b"} Apr 17 16:32:58.380638 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:58.380175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerStarted","Data":"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5"} Apr 17 16:32:58.410648 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:32:58.410590 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.431947289 podStartE2EDuration="6.410571005s" podCreationTimestamp="2026-04-17 16:32:52 +0000 UTC" firstStartedPulling="2026-04-17 16:32:53.349244825 +0000 UTC m=+70.872270398" lastFinishedPulling="2026-04-17 16:32:57.327868551 +0000 UTC m=+74.850894114" observedRunningTime="2026-04-17 16:32:58.408346088 +0000 UTC m=+75.931371674" watchObservedRunningTime="2026-04-17 16:32:58.410571005 +0000 UTC m=+75.933596591" Apr 17 16:33:00.369013 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:00.368986 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-85f4f855f9-2dwwm" Apr 17 16:33:01.468926 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:01.468885 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:33:01.468926 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:01.468939 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:33:01.475034 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:01.475005 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:33:02.396526 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:33:02.396496 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:33:02.911280 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:02.911213 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:24.360926 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:24.360892 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hdwf7" Apr 17 16:33:52.911777 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:52.911739 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:52.931046 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:52.931020 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:33:53.562805 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:33:53.562774 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:06.722514 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:06.722470 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:34:06.722993 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:06.722908 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="alertmanager" containerID="cri-o://499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487" gracePeriod=120 Apr 17 16:34:06.722993 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:06.722974 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy-web" containerID="cri-o://9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257" gracePeriod=120 Apr 17 16:34:06.723123 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:06.723003 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="config-reloader" containerID="cri-o://d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763" gracePeriod=120 Apr 17 16:34:06.723123 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:06.723035 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy" containerID="cri-o://41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af" gracePeriod=120 Apr 17 16:34:06.723123 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:06.722958 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy-metric" containerID="cri-o://4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8" gracePeriod=120 Apr 17 16:34:06.723123 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:06.723011 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="prom-label-proxy" containerID="cri-o://20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe" gracePeriod=120 Apr 17 16:34:07.593819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.593785 2569 generic.go:358] "Generic (PLEG): container finished" podID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerID="20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe" 
exitCode=0 Apr 17 16:34:07.593819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.593811 2569 generic.go:358] "Generic (PLEG): container finished" podID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerID="41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af" exitCode=0 Apr 17 16:34:07.593819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.593818 2569 generic.go:358] "Generic (PLEG): container finished" podID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerID="d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763" exitCode=0 Apr 17 16:34:07.593819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.593824 2569 generic.go:358] "Generic (PLEG): container finished" podID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerID="499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487" exitCode=0 Apr 17 16:34:07.594079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.593854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerDied","Data":"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe"} Apr 17 16:34:07.594079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.593885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerDied","Data":"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af"} Apr 17 16:34:07.594079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.593897 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerDied","Data":"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763"} Apr 17 16:34:07.594079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.593906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerDied","Data":"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487"} Apr 17 16:34:07.959557 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:07.959529 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:34:08.102097 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102048 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102303 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102125 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-metrics-client-ca\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102303 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102151 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-cluster-tls-config\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102303 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102178 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-out\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102303 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:34:08.102200 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-web-config\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102303 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102238 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-web\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102510 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102307 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-main-tls\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102510 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102332 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-main-db\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102510 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102362 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-tls-assets\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.102658 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102585 2569 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:08.102882 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102853 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:08.103217 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.102954 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:34:08.103217 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.103010 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-metric\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.103217 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.103147 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.103217 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.103195 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-volume\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.103464 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.103221 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68d65\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-kube-api-access-68d65\") pod \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\" (UID: \"b146f1f0-bf94-4d1a-9a88-bd721fe8e564\") " Apr 17 16:34:08.103519 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.103501 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 
16:34:08.103573 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.103521 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-metrics-client-ca\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.103573 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.103537 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-alertmanager-main-db\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.105170 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.105140 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:08.105744 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.105499 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:08.105744 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.105540 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:08.105904 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.105764 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-kube-api-access-68d65" (OuterVolumeSpecName: "kube-api-access-68d65") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "kube-api-access-68d65". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:08.105904 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.105789 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-out" (OuterVolumeSpecName: "config-out") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:34:08.105904 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.105795 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:08.107206 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.107185 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-volume" (OuterVolumeSpecName: "config-volume") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:08.107702 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.107684 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:08.109637 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.109610 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:08.115730 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.115705 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-web-config" (OuterVolumeSpecName: "web-config") pod "b146f1f0-bf94-4d1a-9a88-bd721fe8e564" (UID: "b146f1f0-bf94-4d1a-9a88-bd721fe8e564"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:08.204200 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204116 2569 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-cluster-tls-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204200 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204150 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-out\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204200 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204165 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-web-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204200 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204179 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204200 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204191 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-main-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204200 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204200 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-tls-assets\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204486 
ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204210 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204486 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204220 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204486 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204234 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-config-volume\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.204486 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.204246 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-68d65\" (UniqueName: \"kubernetes.io/projected/b146f1f0-bf94-4d1a-9a88-bd721fe8e564-kube-api-access-68d65\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:08.600281 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.600183 2569 generic.go:358] "Generic (PLEG): container finished" podID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerID="4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8" exitCode=0 Apr 17 16:34:08.600281 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.600208 2569 generic.go:358] "Generic (PLEG): container finished" podID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerID="9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257" exitCode=0 Apr 17 16:34:08.600461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.600282 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerDied","Data":"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8"} Apr 17 16:34:08.600461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.600319 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerDied","Data":"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257"} Apr 17 16:34:08.600461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.600332 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b146f1f0-bf94-4d1a-9a88-bd721fe8e564","Type":"ContainerDied","Data":"0857aa699aed96557e009c6c980c943006b38ec6d5f80e46a6cefb9a139c9d6e"} Apr 17 16:34:08.600461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.600346 2569 scope.go:117] "RemoveContainer" containerID="20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe" Apr 17 16:34:08.600461 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.600347 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:34:08.608294 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.608270 2569 scope.go:117] "RemoveContainer" containerID="4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8" Apr 17 16:34:08.615122 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.615103 2569 scope.go:117] "RemoveContainer" containerID="41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af" Apr 17 16:34:08.621492 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.621474 2569 scope.go:117] "RemoveContainer" containerID="9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257" Apr 17 16:34:08.624385 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.624363 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:34:08.630201 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.630155 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:34:08.630292 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.630271 2569 scope.go:117] "RemoveContainer" containerID="d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763" Apr 17 16:34:08.636543 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.636526 2569 scope.go:117] "RemoveContainer" containerID="499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487" Apr 17 16:34:08.643224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.643204 2569 scope.go:117] "RemoveContainer" containerID="557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f" Apr 17 16:34:08.649734 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.649716 2569 scope.go:117] "RemoveContainer" containerID="20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe" Apr 17 16:34:08.650007 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:08.649977 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe\": container with ID starting with 20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe not found: ID does not exist" containerID="20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe" Apr 17 16:34:08.650095 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.650005 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe"} err="failed to get container status \"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe\": rpc error: code = NotFound desc = could not find container \"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe\": container with ID starting with 20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe not found: ID does not exist" Apr 17 16:34:08.650095 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.650040 2569 scope.go:117] "RemoveContainer" containerID="4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8" Apr 17 16:34:08.650307 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:08.650288 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8\": container with ID starting with 4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8 not found: ID does not exist" containerID="4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8" Apr 17 16:34:08.650366 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.650316 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8"} err="failed to get container status \"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8\": rpc error: code = NotFound desc 
= could not find container \"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8\": container with ID starting with 4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8 not found: ID does not exist" Apr 17 16:34:08.650366 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.650338 2569 scope.go:117] "RemoveContainer" containerID="41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af" Apr 17 16:34:08.650560 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:08.650545 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af\": container with ID starting with 41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af not found: ID does not exist" containerID="41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af" Apr 17 16:34:08.650598 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.650564 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af"} err="failed to get container status \"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af\": rpc error: code = NotFound desc = could not find container \"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af\": container with ID starting with 41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af not found: ID does not exist" Apr 17 16:34:08.650598 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.650577 2569 scope.go:117] "RemoveContainer" containerID="9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257" Apr 17 16:34:08.650818 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:08.650801 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257\": 
container with ID starting with 9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257 not found: ID does not exist" containerID="9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257" Apr 17 16:34:08.650862 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.650823 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257"} err="failed to get container status \"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257\": rpc error: code = NotFound desc = could not find container \"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257\": container with ID starting with 9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257 not found: ID does not exist" Apr 17 16:34:08.650862 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.650838 2569 scope.go:117] "RemoveContainer" containerID="d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763" Apr 17 16:34:08.651069 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:08.651051 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763\": container with ID starting with d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763 not found: ID does not exist" containerID="d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763" Apr 17 16:34:08.651110 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.651074 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763"} err="failed to get container status \"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763\": rpc error: code = NotFound desc = could not find container \"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763\": container with 
ID starting with d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763 not found: ID does not exist" Apr 17 16:34:08.651110 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.651093 2569 scope.go:117] "RemoveContainer" containerID="499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487" Apr 17 16:34:08.651337 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:08.651308 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487\": container with ID starting with 499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487 not found: ID does not exist" containerID="499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487" Apr 17 16:34:08.651419 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.651333 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487"} err="failed to get container status \"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487\": rpc error: code = NotFound desc = could not find container \"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487\": container with ID starting with 499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487 not found: ID does not exist" Apr 17 16:34:08.651419 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.651348 2569 scope.go:117] "RemoveContainer" containerID="557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f" Apr 17 16:34:08.651569 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:08.651554 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f\": container with ID starting with 557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f not found: ID does 
not exist" containerID="557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f" Apr 17 16:34:08.651621 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.651577 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f"} err="failed to get container status \"557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f\": rpc error: code = NotFound desc = could not find container \"557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f\": container with ID starting with 557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f not found: ID does not exist" Apr 17 16:34:08.651621 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.651593 2569 scope.go:117] "RemoveContainer" containerID="20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe" Apr 17 16:34:08.651848 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.651828 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe"} err="failed to get container status \"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe\": rpc error: code = NotFound desc = could not find container \"20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe\": container with ID starting with 20b9941c161764b17674037e33def2ae342b5078e09edd9547f2aa26b13eb3fe not found: ID does not exist" Apr 17 16:34:08.651915 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.651849 2569 scope.go:117] "RemoveContainer" containerID="4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8" Apr 17 16:34:08.652076 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652056 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8"} err="failed to get container status 
\"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8\": rpc error: code = NotFound desc = could not find container \"4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8\": container with ID starting with 4b7d62f74de0937a8f11060422041c4eb2e71ef00ab2a8cf76c4c73e1207e3a8 not found: ID does not exist" Apr 17 16:34:08.652140 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652076 2569 scope.go:117] "RemoveContainer" containerID="41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af" Apr 17 16:34:08.652308 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652285 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af"} err="failed to get container status \"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af\": rpc error: code = NotFound desc = could not find container \"41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af\": container with ID starting with 41f110b00fef153e9afbe5c6d7ab9ea95cf85cea2c175ba174887f880e7b59af not found: ID does not exist" Apr 17 16:34:08.652367 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652310 2569 scope.go:117] "RemoveContainer" containerID="9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257" Apr 17 16:34:08.652531 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652512 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257"} err="failed to get container status \"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257\": rpc error: code = NotFound desc = could not find container \"9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257\": container with ID starting with 9fd357440d632418dcf7abcbd4c14fcd8cd890eaba75e4192821e1629e65d257 not found: ID does not exist" Apr 17 16:34:08.652599 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:34:08.652531 2569 scope.go:117] "RemoveContainer" containerID="d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763" Apr 17 16:34:08.652728 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652707 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763"} err="failed to get container status \"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763\": rpc error: code = NotFound desc = could not find container \"d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763\": container with ID starting with d5e875987fd92949174a56fa4f8bf3f9fea401217daf4f3b56dc82cd7b34e763 not found: ID does not exist" Apr 17 16:34:08.652789 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652729 2569 scope.go:117] "RemoveContainer" containerID="499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487" Apr 17 16:34:08.652950 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652927 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487"} err="failed to get container status \"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487\": rpc error: code = NotFound desc = could not find container \"499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487\": container with ID starting with 499c8793a5d3c96d90ad52a711dca99b2cf3725ea86840cdca51d647a845b487 not found: ID does not exist" Apr 17 16:34:08.653012 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.652952 2569 scope.go:117] "RemoveContainer" containerID="557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f" Apr 17 16:34:08.653155 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.653138 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f"} err="failed to get container status \"557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f\": rpc error: code = NotFound desc = could not find container \"557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f\": container with ID starting with 557ab999bed69c5d82eb449e0414483b1e79a53a01c3e383f9dad5750397736f not found: ID does not exist"
Apr 17 16:34:08.657987 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.657575 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:34:08.658115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658016 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="alertmanager"
Apr 17 16:34:08.658115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658046 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="alertmanager"
Apr 17 16:34:08.658115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658061 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy"
Apr 17 16:34:08.658115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658070 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy"
Apr 17 16:34:08.658115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658087 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="init-config-reloader"
Apr 17 16:34:08.658115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658095 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="init-config-reloader"
Apr 17 16:34:08.658115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658105 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy-metric"
Apr 17 16:34:08.658115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658113 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy-metric"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658125 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy-web"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658133 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy-web"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658143 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="prom-label-proxy"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658153 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="prom-label-proxy"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658163 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="config-reloader"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658172 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="config-reloader"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658244 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="config-reloader"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658284 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658297 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="alertmanager"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658307 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy-web"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658317 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="prom-label-proxy"
Apr 17 16:34:08.658526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.658329 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" containerName="kube-rbac-proxy-metric"
Apr 17 16:34:08.663524 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.663501 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.666226 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666199 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 16:34:08.666354 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666244 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 16:34:08.666354 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666312 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 16:34:08.666354 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666199 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 16:34:08.666528 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666438 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 16:34:08.666528 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666522 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-46tcd\""
Apr 17 16:34:08.666731 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666714 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 16:34:08.666782 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666724 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 16:34:08.666782 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.666731 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 16:34:08.673626 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.673607 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 16:34:08.674871 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.674849 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:34:08.807757 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.807723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-config-out\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.807902 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.807762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.807902 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.807783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808012 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.807893 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-web-config\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808012 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.807943 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-config-volume\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808012 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.807971 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808012 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.808003 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.808062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.808097 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.808144 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808316 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.808192 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97wzh\" (UniqueName: \"kubernetes.io/projected/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-kube-api-access-97wzh\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808316 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.808229 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.808316 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.808272 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909436 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97wzh\" (UniqueName: \"kubernetes.io/projected/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-kube-api-access-97wzh\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909436 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909484 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-config-out\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909686 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-web-config\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-config-volume\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909898 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909898 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909991 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909913 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.909991 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.909952 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.910088 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.910007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.910698 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.910610 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.911971 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.911941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.912553 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.912524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.912681 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.912659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-config-out\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.912720 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.912707 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-config-volume\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.912964 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.912944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.913377 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.913359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.913452 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.913402 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.913452 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.913435 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.913525 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.913461 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.914113 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.914093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-web-config\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.914381 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.914363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.921951 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.921921 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97wzh\" (UniqueName: \"kubernetes.io/projected/7cf048f7-68e4-4bc9-beac-730ca8f13ceb-kube-api-access-97wzh\") pod \"alertmanager-main-0\" (UID: \"7cf048f7-68e4-4bc9-beac-730ca8f13ceb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:08.973656 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:08.973619 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:34:09.041390 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:09.041358 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b146f1f0-bf94-4d1a-9a88-bd721fe8e564" path="/var/lib/kubelet/pods/b146f1f0-bf94-4d1a-9a88-bd721fe8e564/volumes"
Apr 17 16:34:09.119038 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:09.118967 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:34:09.123540 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:34:09.123512 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf048f7_68e4_4bc9_beac_730ca8f13ceb.slice/crio-b79dbafa4e96c62bbe66e0fd6ff28044135610a45e034fed7a79e84f17a6426d WatchSource:0}: Error finding container b79dbafa4e96c62bbe66e0fd6ff28044135610a45e034fed7a79e84f17a6426d: Status 404 returned error can't find the container with id b79dbafa4e96c62bbe66e0fd6ff28044135610a45e034fed7a79e84f17a6426d
Apr 17 16:34:09.604526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:09.604491 2569 generic.go:358] "Generic (PLEG): container finished" podID="7cf048f7-68e4-4bc9-beac-730ca8f13ceb" containerID="341e366ea007df26c9f61bdb020985521c57df8f3411c82683df3e05f36fde69" exitCode=0
Apr 17 16:34:09.604701 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:09.604590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cf048f7-68e4-4bc9-beac-730ca8f13ceb","Type":"ContainerDied","Data":"341e366ea007df26c9f61bdb020985521c57df8f3411c82683df3e05f36fde69"}
Apr 17 16:34:09.604701 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:09.604635 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cf048f7-68e4-4bc9-beac-730ca8f13ceb","Type":"ContainerStarted","Data":"b79dbafa4e96c62bbe66e0fd6ff28044135610a45e034fed7a79e84f17a6426d"}
Apr 17 16:34:10.612552 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:10.612519 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cf048f7-68e4-4bc9-beac-730ca8f13ceb","Type":"ContainerStarted","Data":"b472706d03068d46f85565a39e89cc9607a6cdf70d6d976fa6d2da103bccb509"}
Apr 17 16:34:10.612552 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:10.612554 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cf048f7-68e4-4bc9-beac-730ca8f13ceb","Type":"ContainerStarted","Data":"50ec6d6134054449278372f630d74c9b6c760664e21ae1612fad3e46e1d55b43"}
Apr 17 16:34:10.612967 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:10.612564 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cf048f7-68e4-4bc9-beac-730ca8f13ceb","Type":"ContainerStarted","Data":"f82ba0000f61d247466ee90639fd350de411d6396db3fb324f832c42dce4aaa7"}
Apr 17 16:34:10.612967 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:10.612573 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cf048f7-68e4-4bc9-beac-730ca8f13ceb","Type":"ContainerStarted","Data":"5d2f12d77cf763592a700a57f9e4e0bb7b62cf933e24614707668f8a386a772d"}
Apr 17 16:34:10.612967 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:10.612581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cf048f7-68e4-4bc9-beac-730ca8f13ceb","Type":"ContainerStarted","Data":"fa4945d1ebd019cc70d2832aadfee65e8d4a56e1784686546b405c4bd32d9be7"}
Apr 17 16:34:10.612967 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:10.612590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cf048f7-68e4-4bc9-beac-730ca8f13ceb","Type":"ContainerStarted","Data":"4993e9ad50e5e7863293564f426f733c13bad8cd4606eb589862cb1fe3415154"}
Apr 17 16:34:10.650177 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:10.650115 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.65009498 podStartE2EDuration="2.65009498s" podCreationTimestamp="2026-04-17 16:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:34:10.648154206 +0000 UTC m=+148.171179790" watchObservedRunningTime="2026-04-17 16:34:10.65009498 +0000 UTC m=+148.173120563"
Apr 17 16:34:11.049788 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.049705 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:34:11.050356 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.050177 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="prometheus" containerID="cri-o://ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" gracePeriod=600
Apr 17 16:34:11.050356 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.050201 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy-thanos" containerID="cri-o://7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" gracePeriod=600
Apr 17 16:34:11.050356 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.050222 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy-web" containerID="cri-o://206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" gracePeriod=600
Apr 17 16:34:11.050356 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.050231 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="thanos-sidecar" containerID="cri-o://445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" gracePeriod=600
Apr 17 16:34:11.050356 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.050183 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy" containerID="cri-o://32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" gracePeriod=600
Apr 17 16:34:11.050685 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.050352 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="config-reloader" containerID="cri-o://48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" gracePeriod=600
Apr 17 16:34:11.297729 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.297622 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.434160 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434113 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-rulefiles-0\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434160 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434165 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-tls-assets\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434423 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434193 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434423 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434216 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-grpc-tls\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434423 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434237 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-trusted-ca-bundle\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434423 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434277 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-thanos-prometheus-http-client-file\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434423 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434301 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-config-out\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434423 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434324 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-db\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434423 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434365 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-kubelet-serving-ca-bundle\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434423 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434391 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-serving-certs-ca-bundle\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434436 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-tls\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434482 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-metrics-client-certs\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434509 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mgkn\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-kube-api-access-4mgkn\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434541 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-web-config\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434573 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-metrics-client-ca\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434603 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-config\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434646 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434686 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-kube-rbac-proxy\") pod \"1a6d1365-1408-4246-b172-881b465eddcc\" (UID: \"1a6d1365-1408-4246-b172-881b465eddcc\") "
Apr 17 16:34:11.434812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434684 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:11.435327 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.434946 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 16:34:11.435383 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.435332 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:11.435979 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.435687 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:34:11.437004 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.436695 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "configmap-metrics-client-ca".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:11.437004 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.436716 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:11.437165 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.437005 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.437165 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.437125 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:11.437486 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.437464 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). 
InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.438729 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.438692 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.438817 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.438776 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-config-out" (OuterVolumeSpecName: "config-out") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:34:11.438889 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.438851 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:11.438889 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.438867 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.439009 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.438891 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-config" (OuterVolumeSpecName: "config") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.439425 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.439398 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.439513 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.439489 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-kube-api-access-4mgkn" (OuterVolumeSpecName: "kube-api-access-4mgkn") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "kube-api-access-4mgkn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:11.439827 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.439806 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.440463 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.440445 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.448040 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.448020 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-web-config" (OuterVolumeSpecName: "web-config") pod "1a6d1365-1408-4246-b172-881b465eddcc" (UID: "1a6d1365-1408-4246-b172-881b465eddcc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:11.535478 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535421 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535478 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535473 2569 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-kube-rbac-proxy\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535478 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535486 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-127.ec2.internal\" 
DevicePath \"\"" Apr 17 16:34:11.535478 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535497 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-tls-assets\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535507 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535517 2569 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-grpc-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535526 2569 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535536 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-config-out\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535544 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a6d1365-1408-4246-b172-881b465eddcc-prometheus-k8s-db\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 
ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535552 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535561 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535570 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535578 2569 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-secret-metrics-client-certs\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535588 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mgkn\" (UniqueName: \"kubernetes.io/projected/1a6d1365-1408-4246-b172-881b465eddcc-kube-api-access-4mgkn\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535596 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-web-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 
ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535605 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a6d1365-1408-4246-b172-881b465eddcc-configmap-metrics-client-ca\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.535750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.535614 2569 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a6d1365-1408-4246-b172-881b465eddcc-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:11.619079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619037 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a6d1365-1408-4246-b172-881b465eddcc" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" exitCode=0 Apr 17 16:34:11.619079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619072 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a6d1365-1408-4246-b172-881b465eddcc" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" exitCode=0 Apr 17 16:34:11.619079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619084 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a6d1365-1408-4246-b172-881b465eddcc" containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" exitCode=0 Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619094 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a6d1365-1408-4246-b172-881b465eddcc" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" exitCode=0 Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619102 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a6d1365-1408-4246-b172-881b465eddcc" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" exitCode=0 Apr 17 
16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619112 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a6d1365-1408-4246-b172-881b465eddcc" containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" exitCode=0 Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619127 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerDied","Data":"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05"} Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerDied","Data":"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a"} Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerDied","Data":"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8"} Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619197 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619208 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerDied","Data":"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972"} Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619225 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerDied","Data":"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b"} Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619240 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerDied","Data":"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5"} Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619270 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a6d1365-1408-4246-b172-881b465eddcc","Type":"ContainerDied","Data":"cda95809aac2d0f1a33ff643c9fb99c947b2e3c692245dba63aa010bf946e38e"} Apr 17 16:34:11.619646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.619264 2569 scope.go:117] "RemoveContainer" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" Apr 17 16:34:11.627200 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.627173 2569 scope.go:117] "RemoveContainer" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" Apr 17 16:34:11.633681 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.633663 2569 scope.go:117] "RemoveContainer" containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" Apr 17 
16:34:11.640018 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.640001 2569 scope.go:117] "RemoveContainer" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" Apr 17 16:34:11.644147 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.644125 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:34:11.648051 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.648032 2569 scope.go:117] "RemoveContainer" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" Apr 17 16:34:11.649760 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.649741 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:34:11.655018 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.655003 2569 scope.go:117] "RemoveContainer" containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" Apr 17 16:34:11.661691 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.661672 2569 scope.go:117] "RemoveContainer" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" Apr 17 16:34:11.667829 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.667812 2569 scope.go:117] "RemoveContainer" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" Apr 17 16:34:11.668065 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:11.668046 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": container with ID starting with 7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05 not found: ID does not exist" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" Apr 17 16:34:11.668108 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.668074 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05"} err="failed to get container status \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": rpc error: code = NotFound desc = could not find container \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": container with ID starting with 7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05 not found: ID does not exist" Apr 17 16:34:11.668108 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.668092 2569 scope.go:117] "RemoveContainer" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" Apr 17 16:34:11.668361 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:11.668341 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": container with ID starting with 32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a not found: ID does not exist" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" Apr 17 16:34:11.668455 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.668370 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a"} err="failed to get container status \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": rpc error: code = NotFound desc = could not find container \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": container with ID starting with 32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a not found: ID does not exist" Apr 17 16:34:11.668455 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.668395 2569 scope.go:117] "RemoveContainer" containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" Apr 17 16:34:11.668688 ip-10-0-135-127 
kubenswrapper[2569]: E0417 16:34:11.668666 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": container with ID starting with 206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8 not found: ID does not exist" containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" Apr 17 16:34:11.668792 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.668694 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8"} err="failed to get container status \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": rpc error: code = NotFound desc = could not find container \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": container with ID starting with 206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8 not found: ID does not exist" Apr 17 16:34:11.668792 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.668714 2569 scope.go:117] "RemoveContainer" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" Apr 17 16:34:11.669096 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:11.669011 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": container with ID starting with 445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972 not found: ID does not exist" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" Apr 17 16:34:11.669096 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.669064 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972"} 
err="failed to get container status \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": rpc error: code = NotFound desc = could not find container \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": container with ID starting with 445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972 not found: ID does not exist" Apr 17 16:34:11.669096 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.669086 2569 scope.go:117] "RemoveContainer" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" Apr 17 16:34:11.669382 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:11.669361 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": container with ID starting with 48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b not found: ID does not exist" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" Apr 17 16:34:11.669435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.669391 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b"} err="failed to get container status \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": rpc error: code = NotFound desc = could not find container \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": container with ID starting with 48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b not found: ID does not exist" Apr 17 16:34:11.669435 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.669411 2569 scope.go:117] "RemoveContainer" containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" Apr 17 16:34:11.669642 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:11.669625 2569 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": container with ID starting with ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5 not found: ID does not exist" containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" Apr 17 16:34:11.669679 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.669649 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5"} err="failed to get container status \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": rpc error: code = NotFound desc = could not find container \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": container with ID starting with ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5 not found: ID does not exist" Apr 17 16:34:11.669679 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.669666 2569 scope.go:117] "RemoveContainer" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" Apr 17 16:34:11.669917 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:11.669901 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": container with ID starting with 59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5 not found: ID does not exist" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" Apr 17 16:34:11.669956 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.669921 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5"} err="failed to get container status \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": rpc 
error: code = NotFound desc = could not find container \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": container with ID starting with 59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5 not found: ID does not exist" Apr 17 16:34:11.669956 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.669938 2569 scope.go:117] "RemoveContainer" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" Apr 17 16:34:11.670175 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.670157 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05"} err="failed to get container status \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": rpc error: code = NotFound desc = could not find container \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": container with ID starting with 7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05 not found: ID does not exist" Apr 17 16:34:11.670216 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.670178 2569 scope.go:117] "RemoveContainer" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" Apr 17 16:34:11.670417 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.670400 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a"} err="failed to get container status \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": rpc error: code = NotFound desc = could not find container \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": container with ID starting with 32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a not found: ID does not exist" Apr 17 16:34:11.670465 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.670417 2569 scope.go:117] "RemoveContainer" 
containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" Apr 17 16:34:11.670612 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.670595 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8"} err="failed to get container status \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": rpc error: code = NotFound desc = could not find container \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": container with ID starting with 206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8 not found: ID does not exist" Apr 17 16:34:11.670653 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.670613 2569 scope.go:117] "RemoveContainer" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" Apr 17 16:34:11.670842 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.670822 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972"} err="failed to get container status \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": rpc error: code = NotFound desc = could not find container \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": container with ID starting with 445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972 not found: ID does not exist" Apr 17 16:34:11.670890 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.670842 2569 scope.go:117] "RemoveContainer" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" Apr 17 16:34:11.671067 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671046 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b"} err="failed to get container status 
\"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": rpc error: code = NotFound desc = could not find container \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": container with ID starting with 48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b not found: ID does not exist" Apr 17 16:34:11.671105 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671068 2569 scope.go:117] "RemoveContainer" containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" Apr 17 16:34:11.671345 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671245 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5"} err="failed to get container status \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": rpc error: code = NotFound desc = could not find container \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": container with ID starting with ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5 not found: ID does not exist" Apr 17 16:34:11.671345 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671282 2569 scope.go:117] "RemoveContainer" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" Apr 17 16:34:11.671522 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671504 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5"} err="failed to get container status \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": rpc error: code = NotFound desc = could not find container \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": container with ID starting with 59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5 not found: ID does not exist" Apr 17 16:34:11.671569 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:34:11.671523 2569 scope.go:117] "RemoveContainer" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" Apr 17 16:34:11.671724 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671709 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05"} err="failed to get container status \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": rpc error: code = NotFound desc = could not find container \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": container with ID starting with 7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05 not found: ID does not exist" Apr 17 16:34:11.671724 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671724 2569 scope.go:117] "RemoveContainer" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" Apr 17 16:34:11.671912 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671897 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a"} err="failed to get container status \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": rpc error: code = NotFound desc = could not find container \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": container with ID starting with 32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a not found: ID does not exist" Apr 17 16:34:11.671956 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.671912 2569 scope.go:117] "RemoveContainer" containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" Apr 17 16:34:11.672099 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672083 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8"} err="failed to get container status \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": rpc error: code = NotFound desc = could not find container \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": container with ID starting with 206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8 not found: ID does not exist" Apr 17 16:34:11.672148 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672099 2569 scope.go:117] "RemoveContainer" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" Apr 17 16:34:11.672281 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672266 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972"} err="failed to get container status \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": rpc error: code = NotFound desc = could not find container \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": container with ID starting with 445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972 not found: ID does not exist" Apr 17 16:34:11.672281 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672280 2569 scope.go:117] "RemoveContainer" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" Apr 17 16:34:11.672490 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672472 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b"} err="failed to get container status \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": rpc error: code = NotFound desc = could not find container \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": container with ID starting with 
48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b not found: ID does not exist" Apr 17 16:34:11.672526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672492 2569 scope.go:117] "RemoveContainer" containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" Apr 17 16:34:11.672693 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672676 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5"} err="failed to get container status \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": rpc error: code = NotFound desc = could not find container \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": container with ID starting with ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5 not found: ID does not exist" Apr 17 16:34:11.672761 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672693 2569 scope.go:117] "RemoveContainer" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" Apr 17 16:34:11.672912 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672892 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5"} err="failed to get container status \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": rpc error: code = NotFound desc = could not find container \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": container with ID starting with 59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5 not found: ID does not exist" Apr 17 16:34:11.672953 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.672912 2569 scope.go:117] "RemoveContainer" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" Apr 17 16:34:11.673106 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673090 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05"} err="failed to get container status \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": rpc error: code = NotFound desc = could not find container \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": container with ID starting with 7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05 not found: ID does not exist" Apr 17 16:34:11.673155 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673106 2569 scope.go:117] "RemoveContainer" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" Apr 17 16:34:11.673297 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673281 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a"} err="failed to get container status \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": rpc error: code = NotFound desc = could not find container \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": container with ID starting with 32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a not found: ID does not exist" Apr 17 16:34:11.673345 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673297 2569 scope.go:117] "RemoveContainer" containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" Apr 17 16:34:11.673494 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673479 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8"} err="failed to get container status \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": rpc error: code = NotFound desc = could not find container 
\"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": container with ID starting with 206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8 not found: ID does not exist" Apr 17 16:34:11.673494 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673493 2569 scope.go:117] "RemoveContainer" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" Apr 17 16:34:11.673687 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673673 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972"} err="failed to get container status \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": rpc error: code = NotFound desc = could not find container \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": container with ID starting with 445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972 not found: ID does not exist" Apr 17 16:34:11.673724 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673688 2569 scope.go:117] "RemoveContainer" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" Apr 17 16:34:11.673878 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673862 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b"} err="failed to get container status \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": rpc error: code = NotFound desc = could not find container \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": container with ID starting with 48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b not found: ID does not exist" Apr 17 16:34:11.673931 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.673878 2569 scope.go:117] "RemoveContainer" 
containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" Apr 17 16:34:11.674055 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674040 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5"} err="failed to get container status \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": rpc error: code = NotFound desc = could not find container \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": container with ID starting with ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5 not found: ID does not exist" Apr 17 16:34:11.674094 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674056 2569 scope.go:117] "RemoveContainer" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" Apr 17 16:34:11.674231 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674217 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5"} err="failed to get container status \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": rpc error: code = NotFound desc = could not find container \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": container with ID starting with 59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5 not found: ID does not exist" Apr 17 16:34:11.674231 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674231 2569 scope.go:117] "RemoveContainer" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" Apr 17 16:34:11.674431 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674415 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05"} err="failed to get container status 
\"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": rpc error: code = NotFound desc = could not find container \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": container with ID starting with 7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05 not found: ID does not exist" Apr 17 16:34:11.674431 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674431 2569 scope.go:117] "RemoveContainer" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" Apr 17 16:34:11.674598 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674582 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a"} err="failed to get container status \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": rpc error: code = NotFound desc = could not find container \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": container with ID starting with 32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a not found: ID does not exist" Apr 17 16:34:11.674634 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674597 2569 scope.go:117] "RemoveContainer" containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" Apr 17 16:34:11.674772 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674750 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8"} err="failed to get container status \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": rpc error: code = NotFound desc = could not find container \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": container with ID starting with 206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8 not found: ID does not exist" Apr 17 16:34:11.674772 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:34:11.674771 2569 scope.go:117] "RemoveContainer" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" Apr 17 16:34:11.674960 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674944 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972"} err="failed to get container status \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": rpc error: code = NotFound desc = could not find container \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": container with ID starting with 445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972 not found: ID does not exist" Apr 17 16:34:11.674960 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.674959 2569 scope.go:117] "RemoveContainer" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" Apr 17 16:34:11.675130 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675114 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b"} err="failed to get container status \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": rpc error: code = NotFound desc = could not find container \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": container with ID starting with 48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b not found: ID does not exist" Apr 17 16:34:11.675130 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675129 2569 scope.go:117] "RemoveContainer" containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" Apr 17 16:34:11.675337 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675319 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5"} err="failed to get container status \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": rpc error: code = NotFound desc = could not find container \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": container with ID starting with ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5 not found: ID does not exist" Apr 17 16:34:11.675337 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675337 2569 scope.go:117] "RemoveContainer" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" Apr 17 16:34:11.675523 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675508 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5"} err="failed to get container status \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": rpc error: code = NotFound desc = could not find container \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": container with ID starting with 59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5 not found: ID does not exist" Apr 17 16:34:11.675560 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675524 2569 scope.go:117] "RemoveContainer" containerID="7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05" Apr 17 16:34:11.675707 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675691 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05"} err="failed to get container status \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": rpc error: code = NotFound desc = could not find container \"7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05\": container with ID starting with 
7256be4e55d91aaee1ef3a1dbb0f5df328a0ccc6baa26c846f62636d8f9e3e05 not found: ID does not exist" Apr 17 16:34:11.675707 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675706 2569 scope.go:117] "RemoveContainer" containerID="32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a" Apr 17 16:34:11.675896 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675879 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a"} err="failed to get container status \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": rpc error: code = NotFound desc = could not find container \"32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a\": container with ID starting with 32d9596aacc9fda3868c288ad185c42f594806f9c2e6679810fd2177ccf79c5a not found: ID does not exist" Apr 17 16:34:11.675934 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.675898 2569 scope.go:117] "RemoveContainer" containerID="206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8" Apr 17 16:34:11.678836 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.678464 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8"} err="failed to get container status \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": rpc error: code = NotFound desc = could not find container \"206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8\": container with ID starting with 206f2bde7d1703be26ef98cd0c920a981c519870e95f2b595e9b4a6365f436f8 not found: ID does not exist" Apr 17 16:34:11.678836 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.678494 2569 scope.go:117] "RemoveContainer" containerID="445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972" Apr 17 16:34:11.679224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.679204 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972"} err="failed to get container status \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": rpc error: code = NotFound desc = could not find container \"445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972\": container with ID starting with 445abb9c4454e3893aa14e96d5906ab32dec9e389a257b643ac073b3a6bd3972 not found: ID does not exist" Apr 17 16:34:11.679224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.679224 2569 scope.go:117] "RemoveContainer" containerID="48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b" Apr 17 16:34:11.679504 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.679485 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b"} err="failed to get container status \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": rpc error: code = NotFound desc = could not find container \"48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b\": container with ID starting with 48decb8f6c104f6900c226c0882ea71347a428f7b8ae9d55167578eded71622b not found: ID does not exist" Apr 17 16:34:11.679504 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.679504 2569 scope.go:117] "RemoveContainer" containerID="ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5" Apr 17 16:34:11.679720 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.679702 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5"} err="failed to get container status \"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": rpc error: code = NotFound desc = could not find container 
\"ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5\": container with ID starting with ec806488685afbba0f9e9d8dedfecf8c5377a879baf240e94188a0a1c83e28b5 not found: ID does not exist" Apr 17 16:34:11.679787 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.679721 2569 scope.go:117] "RemoveContainer" containerID="59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5" Apr 17 16:34:11.680000 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.679973 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5"} err="failed to get container status \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": rpc error: code = NotFound desc = could not find container \"59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5\": container with ID starting with 59ea6d4938374c796eaf7bce6f79887e9c81d0ea029de097f9664a672abd81d5 not found: ID does not exist" Apr 17 16:34:11.680397 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680380 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:34:11.680688 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680677 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="thanos-sidecar" Apr 17 16:34:11.680733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680689 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="thanos-sidecar" Apr 17 16:34:11.680733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680697 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy" Apr 17 16:34:11.680733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680702 2569 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy" Apr 17 16:34:11.680733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680712 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="config-reloader" Apr 17 16:34:11.680733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680718 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="config-reloader" Apr 17 16:34:11.680733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680724 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy-thanos" Apr 17 16:34:11.680733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680730 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy-thanos" Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680736 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="init-config-reloader" Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680742 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="init-config-reloader" Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680755 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="prometheus" Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680760 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="prometheus" Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680767 2569 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy-web"
Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680771 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy-web"
Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680814 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="prometheus"
Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680820 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="config-reloader"
Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680830 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy-web"
Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680837 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="thanos-sidecar"
Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680842 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy"
Apr 17 16:34:11.680924 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.680848 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a6d1365-1408-4246-b172-881b465eddcc" containerName="kube-rbac-proxy-thanos"
Apr 17 16:34:11.686138 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.686092 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.690679 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.690660 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 16:34:11.690863 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.690844 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 16:34:11.691008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.690989 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 16:34:11.691071 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.691063 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-dm68br4pi30vp\""
Apr 17 16:34:11.691136 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.691121 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 16:34:11.691468 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.691418 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 16:34:11.691468 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.691430 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 16:34:11.691641 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.691508 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 16:34:11.691641 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.691591 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 16:34:11.691813 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.691698 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jpnr9\""
Apr 17 16:34:11.692172 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.692157 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 16:34:11.694572 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.694553 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 16:34:11.694673 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.694555 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 16:34:11.695822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.695804 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 16:34:11.702763 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.702743 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 16:34:11.706437 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.706417 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:34:11.838961 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.838921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.838961 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.838962 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.838992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839026 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839044 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839064 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-config\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-config-out\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839106 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839140 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839158 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrf77\" (UniqueName: \"kubernetes.io/projected/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-kube-api-access-xrf77\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839227 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839244 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839304 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839583 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839380 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-web-config\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.839583 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.839419 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940345 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940241 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940345 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940345 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-web-config\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940345 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940347 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940375 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940394 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940445 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-config\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940592 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-config-out\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.940650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940640 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941025 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940674 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941025 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941025 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940701 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrf77\" (UniqueName: \"kubernetes.io/projected/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-kube-api-access-xrf77\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941025 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941025 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941025 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941025 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.940883 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941386 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.941086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941879 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.941541 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.941879 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.941801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.943555 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.943524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-web-config\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.943647 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.943596 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.943761 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.943739 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.943853 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.943833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.943911 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.943856 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.943911 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.943835 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.944261 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.944222 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-config-out\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.944759 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.944738 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.944962 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.944939 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.945603 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.945581 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.945848 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.945832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-config\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.946232 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.946210 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.946899 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.946882 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.950093 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.950074 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrf77\" (UniqueName: \"kubernetes.io/projected/ac0c672e-04a0-4fbb-ae4d-8f52f14c74be-kube-api-access-xrf77\") pod \"prometheus-k8s-0\" (UID: \"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:11.995719 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:11.995676 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:12.134590 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:12.134563 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:34:12.136473 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:34:12.136443 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac0c672e_04a0_4fbb_ae4d_8f52f14c74be.slice/crio-292b1284e3a7d874fcb578df70f1cc6dc58dc4a12aae6161dbd0ee95fec0ef20 WatchSource:0}: Error finding container 292b1284e3a7d874fcb578df70f1cc6dc58dc4a12aae6161dbd0ee95fec0ef20: Status 404 returned error can't find the container with id 292b1284e3a7d874fcb578df70f1cc6dc58dc4a12aae6161dbd0ee95fec0ef20
Apr 17 16:34:12.624897 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:12.624866 2569 generic.go:358] "Generic (PLEG): container finished" podID="ac0c672e-04a0-4fbb-ae4d-8f52f14c74be" containerID="173cfcd139b0f12af7bd49be05a57118ee053c10e2383e9e609dacfddffedff3" exitCode=0
Apr 17 16:34:12.625267 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:12.624938 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be","Type":"ContainerDied","Data":"173cfcd139b0f12af7bd49be05a57118ee053c10e2383e9e609dacfddffedff3"}
Apr 17 16:34:12.625267 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:12.624957 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be","Type":"ContainerStarted","Data":"292b1284e3a7d874fcb578df70f1cc6dc58dc4a12aae6161dbd0ee95fec0ef20"}
Apr 17 16:34:13.042741 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:13.042702 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a6d1365-1408-4246-b172-881b465eddcc" path="/var/lib/kubelet/pods/1a6d1365-1408-4246-b172-881b465eddcc/volumes"
Apr 17 16:34:13.631739 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:13.631703 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be","Type":"ContainerStarted","Data":"21b5b0a278dd4440a98255116bd949f06b9ad5dd597df66336c6dcc0fc1b2c64"}
Apr 17 16:34:13.631739 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:13.631741 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be","Type":"ContainerStarted","Data":"b6893a412fe57337a6955e15cc550c201d37421025b2f8152dd2f2e69b82edb9"}
Apr 17 16:34:13.632206 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:13.631754 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be","Type":"ContainerStarted","Data":"dae6de943c2ee0dad3ef8f0479e96a20a874cd02fbaa2c922fed5476fe05cd75"}
Apr 17 16:34:13.632206 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:13.631766 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be","Type":"ContainerStarted","Data":"e1b26dd231ffeeb46a8c3eeef7e7b07daab91d61a3452c19aa8e0f0cffd05785"}
Apr 17 16:34:13.632206 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:13.631777 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be","Type":"ContainerStarted","Data":"ee7e09a666a24300f8e2702dfa0b19bd5473c863abffad8a9008ff872339b905"}
Apr 17 16:34:13.632206 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:13.631787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac0c672e-04a0-4fbb-ae4d-8f52f14c74be","Type":"ContainerStarted","Data":"aa9f33e400fa521206899c120c3e6f41a02cab2db3cce482b1c4baa9cf25f515"}
Apr 17 16:34:13.663464 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:13.663400 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.663380068 podStartE2EDuration="2.663380068s" podCreationTimestamp="2026-04-17 16:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:34:13.660965137 +0000 UTC m=+151.183990721" watchObservedRunningTime="2026-04-17 16:34:13.663380068 +0000 UTC m=+151.186405654"
Apr 17 16:34:16.996796 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:16.996755 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:25.964671 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:25.964633 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65dd5ccdb8-mdkmx"]
Apr 17 16:34:50.985132 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:50.985062 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65dd5ccdb8-mdkmx" podUID="052b105d-312b-43e7-8b4b-e38f8fc75abf" containerName="console" containerID="cri-o://7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d" gracePeriod=15
Apr 17 16:34:51.225241 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.225218 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65dd5ccdb8-mdkmx_052b105d-312b-43e7-8b4b-e38f8fc75abf/console/0.log"
Apr 17 16:34:51.225388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.225298 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65dd5ccdb8-mdkmx"
Apr 17 16:34:51.357705 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.357673 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-serving-cert\") pod \"052b105d-312b-43e7-8b4b-e38f8fc75abf\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") "
Apr 17 16:34:51.357880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.357714 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-oauth-config\") pod \"052b105d-312b-43e7-8b4b-e38f8fc75abf\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") "
Apr 17 16:34:51.357880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.357761 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-oauth-serving-cert\") pod \"052b105d-312b-43e7-8b4b-e38f8fc75abf\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") "
Apr 17 16:34:51.357880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.357787 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-service-ca\") pod \"052b105d-312b-43e7-8b4b-e38f8fc75abf\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") "
Apr 17 16:34:51.357880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.357821 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt2fp\" (UniqueName: \"kubernetes.io/projected/052b105d-312b-43e7-8b4b-e38f8fc75abf-kube-api-access-wt2fp\") pod \"052b105d-312b-43e7-8b4b-e38f8fc75abf\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") "
Apr 17 16:34:51.357880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.357839 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-config\") pod \"052b105d-312b-43e7-8b4b-e38f8fc75abf\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") "
Apr 17 16:34:51.357880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.357865 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-trusted-ca-bundle\") pod \"052b105d-312b-43e7-8b4b-e38f8fc75abf\" (UID: \"052b105d-312b-43e7-8b4b-e38f8fc75abf\") "
Apr 17 16:34:51.358238 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.358210 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "052b105d-312b-43e7-8b4b-e38f8fc75abf" (UID: "052b105d-312b-43e7-8b4b-e38f8fc75abf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:51.358334 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.358228 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-service-ca" (OuterVolumeSpecName: "service-ca") pod "052b105d-312b-43e7-8b4b-e38f8fc75abf" (UID: "052b105d-312b-43e7-8b4b-e38f8fc75abf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:51.358412 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.358356 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-config" (OuterVolumeSpecName: "console-config") pod "052b105d-312b-43e7-8b4b-e38f8fc75abf" (UID: "052b105d-312b-43e7-8b4b-e38f8fc75abf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:51.358526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.358506 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "052b105d-312b-43e7-8b4b-e38f8fc75abf" (UID: "052b105d-312b-43e7-8b4b-e38f8fc75abf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:34:51.360082 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.360053 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "052b105d-312b-43e7-8b4b-e38f8fc75abf" (UID: "052b105d-312b-43e7-8b4b-e38f8fc75abf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:34:51.360400 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.360380 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "052b105d-312b-43e7-8b4b-e38f8fc75abf" (UID: "052b105d-312b-43e7-8b4b-e38f8fc75abf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:34:51.360465 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.360384 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052b105d-312b-43e7-8b4b-e38f8fc75abf-kube-api-access-wt2fp" (OuterVolumeSpecName: "kube-api-access-wt2fp") pod "052b105d-312b-43e7-8b4b-e38f8fc75abf" (UID: "052b105d-312b-43e7-8b4b-e38f8fc75abf"). InnerVolumeSpecName "kube-api-access-wt2fp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:34:51.459066 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.459027 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-serving-cert\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 16:34:51.459066 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.459059 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-oauth-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 16:34:51.459066 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.459069 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-oauth-serving-cert\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 16:34:51.459326 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.459080 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-service-ca\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 16:34:51.459326 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.459089 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wt2fp\" (UniqueName:
\"kubernetes.io/projected/052b105d-312b-43e7-8b4b-e38f8fc75abf-kube-api-access-wt2fp\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:51.459326 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.459098 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-console-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:51.459326 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.459107 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/052b105d-312b-43e7-8b4b-e38f8fc75abf-trusted-ca-bundle\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:34:51.750615 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.750530 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65dd5ccdb8-mdkmx_052b105d-312b-43e7-8b4b-e38f8fc75abf/console/0.log" Apr 17 16:34:51.750615 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.750570 2569 generic.go:358] "Generic (PLEG): container finished" podID="052b105d-312b-43e7-8b4b-e38f8fc75abf" containerID="7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d" exitCode=2 Apr 17 16:34:51.750819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.750604 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65dd5ccdb8-mdkmx" event={"ID":"052b105d-312b-43e7-8b4b-e38f8fc75abf","Type":"ContainerDied","Data":"7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d"} Apr 17 16:34:51.750819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.750640 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65dd5ccdb8-mdkmx" Apr 17 16:34:51.750819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.750646 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65dd5ccdb8-mdkmx" event={"ID":"052b105d-312b-43e7-8b4b-e38f8fc75abf","Type":"ContainerDied","Data":"5c7a9c52f45e48643383fa2ba6260cec7762bef97d63a29efcb7d7a71f2f730e"} Apr 17 16:34:51.750819 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.750664 2569 scope.go:117] "RemoveContainer" containerID="7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d" Apr 17 16:34:51.760135 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.760117 2569 scope.go:117] "RemoveContainer" containerID="7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d" Apr 17 16:34:51.760432 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:34:51.760414 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d\": container with ID starting with 7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d not found: ID does not exist" containerID="7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d" Apr 17 16:34:51.760489 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.760442 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d"} err="failed to get container status \"7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d\": rpc error: code = NotFound desc = could not find container \"7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d\": container with ID starting with 7aac28cff3e889789c3aa0e8c948d98afc8050a23954c746b3fc56cce1cb223d not found: ID does not exist" Apr 17 16:34:51.774235 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.774205 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65dd5ccdb8-mdkmx"] Apr 17 16:34:51.782387 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:51.782363 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65dd5ccdb8-mdkmx"] Apr 17 16:34:53.041272 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:34:53.041230 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="052b105d-312b-43e7-8b4b-e38f8fc75abf" path="/var/lib/kubelet/pods/052b105d-312b-43e7-8b4b-e38f8fc75abf/volumes" Apr 17 16:35:11.996506 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:11.996470 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:12.011802 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:12.011777 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:12.826449 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:12.826422 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:53.791780 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.791745 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nkvf5"] Apr 17 16:35:53.792234 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.792048 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="052b105d-312b-43e7-8b4b-e38f8fc75abf" containerName="console" Apr 17 16:35:53.792234 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.792059 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="052b105d-312b-43e7-8b4b-e38f8fc75abf" containerName="console" Apr 17 16:35:53.792234 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.792137 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="052b105d-312b-43e7-8b4b-e38f8fc75abf" containerName="console" Apr 17 
16:35:53.795400 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.795380 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.798054 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.798033 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:35:53.803308 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.803287 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nkvf5"] Apr 17 16:35:53.886484 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.886446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f82da78f-1d1f-490b-b80e-531065555833-kubelet-config\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.886670 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.886507 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82da78f-1d1f-490b-b80e-531065555833-original-pull-secret\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.886670 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.886579 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f82da78f-1d1f-490b-b80e-531065555833-dbus\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.987626 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.987589 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f82da78f-1d1f-490b-b80e-531065555833-dbus\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.987822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.987648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f82da78f-1d1f-490b-b80e-531065555833-kubelet-config\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.987822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.987679 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82da78f-1d1f-490b-b80e-531065555833-original-pull-secret\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.987822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.987788 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f82da78f-1d1f-490b-b80e-531065555833-kubelet-config\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.987822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:53.987811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f82da78f-1d1f-490b-b80e-531065555833-dbus\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:53.989939 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:35:53.989914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82da78f-1d1f-490b-b80e-531065555833-original-pull-secret\") pod \"global-pull-secret-syncer-nkvf5\" (UID: \"f82da78f-1d1f-490b-b80e-531065555833\") " pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:54.105621 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:54.105587 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nkvf5" Apr 17 16:35:54.225177 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:54.225123 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nkvf5"] Apr 17 16:35:54.229147 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:35:54.228893 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82da78f_1d1f_490b_b80e_531065555833.slice/crio-9bb5042da19605a1cdc4f5805aa34351ba4e10eff3677a2d46850809f57a9bc8 WatchSource:0}: Error finding container 9bb5042da19605a1cdc4f5805aa34351ba4e10eff3677a2d46850809f57a9bc8: Status 404 returned error can't find the container with id 9bb5042da19605a1cdc4f5805aa34351ba4e10eff3677a2d46850809f57a9bc8 Apr 17 16:35:54.928772 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:54.928736 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nkvf5" event={"ID":"f82da78f-1d1f-490b-b80e-531065555833","Type":"ContainerStarted","Data":"9bb5042da19605a1cdc4f5805aa34351ba4e10eff3677a2d46850809f57a9bc8"} Apr 17 16:35:57.941121 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:57.941081 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nkvf5" event={"ID":"f82da78f-1d1f-490b-b80e-531065555833","Type":"ContainerStarted","Data":"981e62c51c4a62e9ad97cb3019f9b0fd6449bf28868a7ea6964fd0b036771c5d"} Apr 17 
16:35:57.956968 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:35:57.956919 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nkvf5" podStartSLOduration=1.438148846 podStartE2EDuration="4.95689982s" podCreationTimestamp="2026-04-17 16:35:53 +0000 UTC" firstStartedPulling="2026-04-17 16:35:54.232677951 +0000 UTC m=+251.755703516" lastFinishedPulling="2026-04-17 16:35:57.751428929 +0000 UTC m=+255.274454490" observedRunningTime="2026-04-17 16:35:57.956167752 +0000 UTC m=+255.479193337" watchObservedRunningTime="2026-04-17 16:35:57.95689982 +0000 UTC m=+255.479925404" Apr 17 16:36:42.935767 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:36:42.935737 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:39:22.480546 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.480512 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-6fmnk"] Apr 17 16:39:22.483450 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.483428 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:22.486480 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.486458 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 16:39:22.487546 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.487528 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-msj5w\"" Apr 17 16:39:22.487546 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.487539 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 16:39:22.487704 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.487547 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 16:39:22.495321 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.495300 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6fmnk"] Apr 17 16:39:22.614584 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.614540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/267806f9-e950-4f00-80b9-35aa3861db64-cert\") pod \"odh-model-controller-696fc77849-6fmnk\" (UID: \"267806f9-e950-4f00-80b9-35aa3861db64\") " pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:22.614770 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.614601 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf768\" (UniqueName: \"kubernetes.io/projected/267806f9-e950-4f00-80b9-35aa3861db64-kube-api-access-bf768\") pod \"odh-model-controller-696fc77849-6fmnk\" (UID: \"267806f9-e950-4f00-80b9-35aa3861db64\") " pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:22.715438 
ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.715405 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/267806f9-e950-4f00-80b9-35aa3861db64-cert\") pod \"odh-model-controller-696fc77849-6fmnk\" (UID: \"267806f9-e950-4f00-80b9-35aa3861db64\") " pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:22.715599 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.715462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bf768\" (UniqueName: \"kubernetes.io/projected/267806f9-e950-4f00-80b9-35aa3861db64-kube-api-access-bf768\") pod \"odh-model-controller-696fc77849-6fmnk\" (UID: \"267806f9-e950-4f00-80b9-35aa3861db64\") " pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:22.715599 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:39:22.715559 2569 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 16:39:22.715678 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:39:22.715641 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/267806f9-e950-4f00-80b9-35aa3861db64-cert podName:267806f9-e950-4f00-80b9-35aa3861db64 nodeName:}" failed. No retries permitted until 2026-04-17 16:39:23.215620454 +0000 UTC m=+460.738646021 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/267806f9-e950-4f00-80b9-35aa3861db64-cert") pod "odh-model-controller-696fc77849-6fmnk" (UID: "267806f9-e950-4f00-80b9-35aa3861db64") : secret "odh-model-controller-webhook-cert" not found Apr 17 16:39:22.727781 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:22.727756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf768\" (UniqueName: \"kubernetes.io/projected/267806f9-e950-4f00-80b9-35aa3861db64-kube-api-access-bf768\") pod \"odh-model-controller-696fc77849-6fmnk\" (UID: \"267806f9-e950-4f00-80b9-35aa3861db64\") " pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:23.220794 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:23.220742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/267806f9-e950-4f00-80b9-35aa3861db64-cert\") pod \"odh-model-controller-696fc77849-6fmnk\" (UID: \"267806f9-e950-4f00-80b9-35aa3861db64\") " pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:23.223144 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:23.223115 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/267806f9-e950-4f00-80b9-35aa3861db64-cert\") pod \"odh-model-controller-696fc77849-6fmnk\" (UID: \"267806f9-e950-4f00-80b9-35aa3861db64\") " pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:23.393880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:23.393837 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:23.514636 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:23.514554 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6fmnk"] Apr 17 16:39:23.517752 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:39:23.517725 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267806f9_e950_4f00_80b9_35aa3861db64.slice/crio-7a655f39b3090b7d82da30d6b7a80e030282786d0ba78129f3d61e620e600d53 WatchSource:0}: Error finding container 7a655f39b3090b7d82da30d6b7a80e030282786d0ba78129f3d61e620e600d53: Status 404 returned error can't find the container with id 7a655f39b3090b7d82da30d6b7a80e030282786d0ba78129f3d61e620e600d53 Apr 17 16:39:23.519471 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:23.519454 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:39:24.515874 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:24.515839 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6fmnk" event={"ID":"267806f9-e950-4f00-80b9-35aa3861db64","Type":"ContainerStarted","Data":"7a655f39b3090b7d82da30d6b7a80e030282786d0ba78129f3d61e620e600d53"} Apr 17 16:39:26.524269 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:26.524207 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6fmnk" event={"ID":"267806f9-e950-4f00-80b9-35aa3861db64","Type":"ContainerStarted","Data":"f09ebc0cf2bbba2a11fcd58110c4f649adaeddd8e484a033381dc875d245ea84"} Apr 17 16:39:26.524701 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:26.524294 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:26.541324 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:26.541272 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-6fmnk" podStartSLOduration=1.70509677 podStartE2EDuration="4.541246242s" podCreationTimestamp="2026-04-17 16:39:22 +0000 UTC" firstStartedPulling="2026-04-17 16:39:23.519576348 +0000 UTC m=+461.042601910" lastFinishedPulling="2026-04-17 16:39:26.355725816 +0000 UTC m=+463.878751382" observedRunningTime="2026-04-17 16:39:26.539486858 +0000 UTC m=+464.062512441" watchObservedRunningTime="2026-04-17 16:39:26.541246242 +0000 UTC m=+464.064271880" Apr 17 16:39:37.529760 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:37.529724 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-6fmnk" Apr 17 16:39:38.338099 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.338058 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-v5zkt"] Apr 17 16:39:38.341729 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.341712 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-v5zkt" Apr 17 16:39:38.344135 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.344112 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 16:39:38.344299 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.344111 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bm2wd\"" Apr 17 16:39:38.349011 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.348984 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-v5zkt"] Apr 17 16:39:38.454566 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.454536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wqtf\" (UniqueName: \"kubernetes.io/projected/7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb-kube-api-access-2wqtf\") pod \"s3-init-v5zkt\" (UID: \"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb\") " pod="kserve/s3-init-v5zkt" Apr 17 16:39:38.555043 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.555009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wqtf\" (UniqueName: \"kubernetes.io/projected/7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb-kube-api-access-2wqtf\") pod \"s3-init-v5zkt\" (UID: \"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb\") " pod="kserve/s3-init-v5zkt" Apr 17 16:39:38.564743 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.564720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wqtf\" (UniqueName: \"kubernetes.io/projected/7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb-kube-api-access-2wqtf\") pod \"s3-init-v5zkt\" (UID: \"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb\") " pod="kserve/s3-init-v5zkt" Apr 17 16:39:38.659975 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.659943 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-v5zkt" Apr 17 16:39:38.785270 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:38.785220 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-v5zkt"] Apr 17 16:39:38.793227 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:39:38.793182 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dbc4a4d_ea9d_420f_8973_64cf3b1cb9fb.slice/crio-c2b58456310bf239597ecb6a54877e6a5e90e9b794ada055010a761bf1de722e WatchSource:0}: Error finding container c2b58456310bf239597ecb6a54877e6a5e90e9b794ada055010a761bf1de722e: Status 404 returned error can't find the container with id c2b58456310bf239597ecb6a54877e6a5e90e9b794ada055010a761bf1de722e Apr 17 16:39:39.562264 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:39.562217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-v5zkt" event={"ID":"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb","Type":"ContainerStarted","Data":"c2b58456310bf239597ecb6a54877e6a5e90e9b794ada055010a761bf1de722e"} Apr 17 16:39:43.384448 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:43.384420 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 16:39:43.578021 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:43.577984 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-v5zkt" event={"ID":"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb","Type":"ContainerStarted","Data":"a498f3f5c234c8e1ab7125c2384f0613f40894d8cd87a828c8e150db6d16e02d"} Apr 17 16:39:43.599057 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:43.599008 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-v5zkt" podStartSLOduration=1.012002498 podStartE2EDuration="5.598993087s" podCreationTimestamp="2026-04-17 16:39:38 +0000 UTC" firstStartedPulling="2026-04-17 16:39:38.794956147 +0000 UTC m=+476.317981712" 
lastFinishedPulling="2026-04-17 16:39:43.38194674 +0000 UTC m=+480.904972301" observedRunningTime="2026-04-17 16:39:43.598721273 +0000 UTC m=+481.121746858" watchObservedRunningTime="2026-04-17 16:39:43.598993087 +0000 UTC m=+481.122018671" Apr 17 16:39:46.587755 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:46.587724 2569 generic.go:358] "Generic (PLEG): container finished" podID="7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb" containerID="a498f3f5c234c8e1ab7125c2384f0613f40894d8cd87a828c8e150db6d16e02d" exitCode=0 Apr 17 16:39:46.588147 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:46.587806 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-v5zkt" event={"ID":"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb","Type":"ContainerDied","Data":"a498f3f5c234c8e1ab7125c2384f0613f40894d8cd87a828c8e150db6d16e02d"} Apr 17 16:39:47.719835 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:47.719808 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-v5zkt" Apr 17 16:39:47.737422 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:47.737387 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wqtf\" (UniqueName: \"kubernetes.io/projected/7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb-kube-api-access-2wqtf\") pod \"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb\" (UID: \"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb\") " Apr 17 16:39:47.739533 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:47.739505 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb-kube-api-access-2wqtf" (OuterVolumeSpecName: "kube-api-access-2wqtf") pod "7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb" (UID: "7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb"). InnerVolumeSpecName "kube-api-access-2wqtf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:39:47.838556 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:47.838525 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2wqtf\" (UniqueName: \"kubernetes.io/projected/7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb-kube-api-access-2wqtf\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:39:48.594733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:48.594704 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-v5zkt" Apr 17 16:39:48.594921 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:48.594703 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-v5zkt" event={"ID":"7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb","Type":"ContainerDied","Data":"c2b58456310bf239597ecb6a54877e6a5e90e9b794ada055010a761bf1de722e"} Apr 17 16:39:48.594921 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:48.594823 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b58456310bf239597ecb6a54877e6a5e90e9b794ada055010a761bf1de722e" Apr 17 16:39:58.284476 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.284427 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2"] Apr 17 16:39:58.284971 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.284848 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb" containerName="s3-init" Apr 17 16:39:58.284971 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.284866 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb" containerName="s3-init" Apr 17 16:39:58.284971 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.284948 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb" containerName="s3-init" Apr 17 
16:39:58.288312 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.288288 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.290983 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.290960 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\"" Apr 17 16:39:58.291125 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.291087 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-2a85d-predictor-serving-cert\"" Apr 17 16:39:58.291223 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.291125 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:39:58.291223 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.291208 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:39:58.292364 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.292323 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7cr77\"" Apr 17 16:39:58.304724 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.304695 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2"] Apr 17 16:39:58.323199 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.323160 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6947aac7-9c49-4c08-9cc4-060fa22367f2-proxy-tls\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.323368 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.323278 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6947aac7-9c49-4c08-9cc4-060fa22367f2-isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.323368 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.323314 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6947aac7-9c49-4c08-9cc4-060fa22367f2-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.323368 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.323347 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27rg6\" (UniqueName: \"kubernetes.io/projected/6947aac7-9c49-4c08-9cc4-060fa22367f2-kube-api-access-27rg6\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.424468 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.424437 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6947aac7-9c49-4c08-9cc4-060fa22367f2-kserve-provision-location\") pod 
\"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.424657 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.424476 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27rg6\" (UniqueName: \"kubernetes.io/projected/6947aac7-9c49-4c08-9cc4-060fa22367f2-kube-api-access-27rg6\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.424657 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.424526 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6947aac7-9c49-4c08-9cc4-060fa22367f2-proxy-tls\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.424657 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.424572 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6947aac7-9c49-4c08-9cc4-060fa22367f2-isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.424908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.424881 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6947aac7-9c49-4c08-9cc4-060fa22367f2-kserve-provision-location\") pod 
\"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.425186 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.425163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6947aac7-9c49-4c08-9cc4-060fa22367f2-isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.427135 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.427112 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6947aac7-9c49-4c08-9cc4-060fa22367f2-proxy-tls\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.433988 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.433964 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27rg6\" (UniqueName: \"kubernetes.io/projected/6947aac7-9c49-4c08-9cc4-060fa22367f2-kube-api-access-27rg6\") pod \"isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.602520 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.602421 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:39:58.730079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:58.730054 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2"] Apr 17 16:39:58.732700 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:39:58.732671 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6947aac7_9c49_4c08_9cc4_060fa22367f2.slice/crio-f3a6fae2c48d63cec586b16f702ab86b0375d652e5cafe1e07da0e993b7a8af8 WatchSource:0}: Error finding container f3a6fae2c48d63cec586b16f702ab86b0375d652e5cafe1e07da0e993b7a8af8: Status 404 returned error can't find the container with id f3a6fae2c48d63cec586b16f702ab86b0375d652e5cafe1e07da0e993b7a8af8 Apr 17 16:39:59.626892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:39:59.626839 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerStarted","Data":"f3a6fae2c48d63cec586b16f702ab86b0375d652e5cafe1e07da0e993b7a8af8"} Apr 17 16:40:03.642284 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:03.642225 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerStarted","Data":"25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945"} Apr 17 16:40:06.652266 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:06.652221 2569 generic.go:358] "Generic (PLEG): container finished" podID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerID="25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945" exitCode=0 Apr 17 16:40:06.652714 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:06.652299 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerDied","Data":"25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945"} Apr 17 16:40:20.707365 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:20.707323 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerStarted","Data":"f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1"} Apr 17 16:40:22.715013 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:22.714973 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerStarted","Data":"9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691"} Apr 17 16:40:25.731058 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:25.731022 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerStarted","Data":"b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7"} Apr 17 16:40:25.731548 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:25.731359 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:40:25.731548 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:25.731492 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:40:25.732769 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:25.732724 2569 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:40:25.751087 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:25.751038 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podStartSLOduration=1.320744764 podStartE2EDuration="27.751026919s" podCreationTimestamp="2026-04-17 16:39:58 +0000 UTC" firstStartedPulling="2026-04-17 16:39:58.734625103 +0000 UTC m=+496.257650669" lastFinishedPulling="2026-04-17 16:40:25.164907257 +0000 UTC m=+522.687932824" observedRunningTime="2026-04-17 16:40:25.749548773 +0000 UTC m=+523.272574369" watchObservedRunningTime="2026-04-17 16:40:25.751026919 +0000 UTC m=+523.274052504" Apr 17 16:40:26.733988 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:26.733909 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:40:26.733988 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:26.733943 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:40:26.734899 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:26.734878 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:40:27.738092 ip-10-0-135-127 kubenswrapper[2569]: 
I0417 16:40:27.738047 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:40:27.739421 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:27.739375 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:40:27.742901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:27.742882 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:40:28.741228 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:28.741189 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:40:28.741630 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:28.741531 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:40:38.741588 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:38.741531 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:40:38.741978 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:38.741930 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:40:48.742067 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:48.742020 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:40:48.742558 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:48.742533 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:40:58.742103 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:58.742050 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:40:58.742688 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:40:58.742452 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:41:08.742042 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:08.741990 2569 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:41:08.742488 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:08.742434 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:41:18.741477 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:18.741422 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:41:18.741898 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:18.741867 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:41:28.741464 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:28.741426 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:41:28.741895 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:28.741683 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:41:43.335893 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.335858 2569 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2"] Apr 17 16:41:43.336420 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.336340 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" containerID="cri-o://f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1" gracePeriod=30 Apr 17 16:41:43.336420 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.336361 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" containerID="cri-o://b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7" gracePeriod=30 Apr 17 16:41:43.336548 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.336362 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" containerID="cri-o://9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691" gracePeriod=30 Apr 17 16:41:43.471996 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.471968 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k"] Apr 17 16:41:43.474460 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.474444 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.476992 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.476971 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\"" Apr 17 16:41:43.477095 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.477067 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-f88b3-predictor-serving-cert\"" Apr 17 16:41:43.487518 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.487495 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k"] Apr 17 16:41:43.516322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.516294 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6n4n\" (UniqueName: \"kubernetes.io/projected/7c3b027b-aaf4-4013-b294-9634cef66a61-kube-api-access-c6n4n\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.516459 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.516332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c3b027b-aaf4-4013-b294-9634cef66a61-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.516459 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.516379 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c3b027b-aaf4-4013-b294-9634cef66a61-proxy-tls\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.516459 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.516398 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c3b027b-aaf4-4013-b294-9634cef66a61-isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.565363 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.565328 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j"] Apr 17 16:41:43.568017 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.567999 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.570628 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.570604 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\"" Apr 17 16:41:43.570723 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.570699 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-f88b3-predictor-serving-cert\"" Apr 17 16:41:43.578639 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.578618 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j"] Apr 17 16:41:43.617010 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.616982 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/472fed14-7cb5-47c1-8f39-8fbe93f72718-proxy-tls\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.617150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617027 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/472fed14-7cb5-47c1-8f39-8fbe93f72718-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.617150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6n4n\" (UniqueName: 
\"kubernetes.io/projected/7c3b027b-aaf4-4013-b294-9634cef66a61-kube-api-access-c6n4n\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.617150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c3b027b-aaf4-4013-b294-9634cef66a61-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.617364 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617183 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c3b027b-aaf4-4013-b294-9634cef66a61-proxy-tls\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.617364 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617207 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c3b027b-aaf4-4013-b294-9634cef66a61-isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.617364 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617232 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv7vr\" 
(UniqueName: \"kubernetes.io/projected/472fed14-7cb5-47c1-8f39-8fbe93f72718-kube-api-access-qv7vr\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.617364 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617286 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/472fed14-7cb5-47c1-8f39-8fbe93f72718-isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.617533 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617498 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c3b027b-aaf4-4013-b294-9634cef66a61-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.617858 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.617841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c3b027b-aaf4-4013-b294-9634cef66a61-isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.619654 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.619632 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c3b027b-aaf4-4013-b294-9634cef66a61-proxy-tls\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.625011 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.624987 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6n4n\" (UniqueName: \"kubernetes.io/projected/7c3b027b-aaf4-4013-b294-9634cef66a61-kube-api-access-c6n4n\") pod \"isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.717697 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.717658 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qv7vr\" (UniqueName: \"kubernetes.io/projected/472fed14-7cb5-47c1-8f39-8fbe93f72718-kube-api-access-qv7vr\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.717697 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.717704 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/472fed14-7cb5-47c1-8f39-8fbe93f72718-isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.717940 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.717731 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/472fed14-7cb5-47c1-8f39-8fbe93f72718-proxy-tls\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.717940 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.717763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/472fed14-7cb5-47c1-8f39-8fbe93f72718-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.718150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.718134 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/472fed14-7cb5-47c1-8f39-8fbe93f72718-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.718439 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.718415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/472fed14-7cb5-47c1-8f39-8fbe93f72718-isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.720346 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.720324 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/472fed14-7cb5-47c1-8f39-8fbe93f72718-proxy-tls\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.726386 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.726361 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv7vr\" (UniqueName: \"kubernetes.io/projected/472fed14-7cb5-47c1-8f39-8fbe93f72718-kube-api-access-qv7vr\") pod \"isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.784691 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.784665 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:43.880868 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.880792 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:41:43.904163 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.904069 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k"] Apr 17 16:41:43.906520 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:41:43.906491 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c3b027b_aaf4_4013_b294_9634cef66a61.slice/crio-e4b11df23d23e05c80608eb0286ce3857f5b540ec34b7ba0efa8a2700f47fc35 WatchSource:0}: Error finding container e4b11df23d23e05c80608eb0286ce3857f5b540ec34b7ba0efa8a2700f47fc35: Status 404 returned error can't find the container with id e4b11df23d23e05c80608eb0286ce3857f5b540ec34b7ba0efa8a2700f47fc35 Apr 17 16:41:43.959594 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.959565 2569 generic.go:358] "Generic (PLEG): container finished" podID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerID="9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691" exitCode=2 Apr 17 16:41:43.959727 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.959641 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerDied","Data":"9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691"} Apr 17 16:41:43.960770 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:43.960737 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" event={"ID":"7c3b027b-aaf4-4013-b294-9634cef66a61","Type":"ContainerStarted","Data":"e4b11df23d23e05c80608eb0286ce3857f5b540ec34b7ba0efa8a2700f47fc35"} Apr 17 16:41:44.012019 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:44.011990 2569 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j"] Apr 17 16:41:44.014397 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:41:44.014371 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod472fed14_7cb5_47c1_8f39_8fbe93f72718.slice/crio-c8fc626c9853d19974bfad228f810bb52230c4287a133ea256ae6bab14927edf WatchSource:0}: Error finding container c8fc626c9853d19974bfad228f810bb52230c4287a133ea256ae6bab14927edf: Status 404 returned error can't find the container with id c8fc626c9853d19974bfad228f810bb52230c4287a133ea256ae6bab14927edf Apr 17 16:41:44.964942 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:44.964905 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" event={"ID":"7c3b027b-aaf4-4013-b294-9634cef66a61","Type":"ContainerStarted","Data":"7fa8ba14cad7e2d78a3464454d751eeeed55b995d8713bd74319ccc2cfc0d26c"} Apr 17 16:41:44.966347 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:44.966324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" event={"ID":"472fed14-7cb5-47c1-8f39-8fbe93f72718","Type":"ContainerStarted","Data":"05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b"} Apr 17 16:41:44.966347 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:44.966351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" event={"ID":"472fed14-7cb5-47c1-8f39-8fbe93f72718","Type":"ContainerStarted","Data":"c8fc626c9853d19974bfad228f810bb52230c4287a133ea256ae6bab14927edf"} Apr 17 16:41:47.738456 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:47.738420 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 17 16:41:47.977120 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:47.977017 2569 generic.go:358] "Generic (PLEG): container finished" podID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerID="7fa8ba14cad7e2d78a3464454d751eeeed55b995d8713bd74319ccc2cfc0d26c" exitCode=0 Apr 17 16:41:47.977120 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:47.977098 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" event={"ID":"7c3b027b-aaf4-4013-b294-9634cef66a61","Type":"ContainerDied","Data":"7fa8ba14cad7e2d78a3464454d751eeeed55b995d8713bd74319ccc2cfc0d26c"} Apr 17 16:41:47.978470 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:47.978448 2569 generic.go:358] "Generic (PLEG): container finished" podID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerID="05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b" exitCode=0 Apr 17 16:41:47.978584 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:47.978563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" event={"ID":"472fed14-7cb5-47c1-8f39-8fbe93f72718","Type":"ContainerDied","Data":"05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b"} Apr 17 16:41:47.980721 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:47.980691 2569 generic.go:358] "Generic (PLEG): container finished" podID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerID="f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1" exitCode=0 Apr 17 16:41:47.980810 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:47.980732 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerDied","Data":"f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1"} Apr 17 16:41:48.741564 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:48.741514 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:41:48.742210 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:48.741982 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:41:48.986887 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:48.986845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" event={"ID":"7c3b027b-aaf4-4013-b294-9634cef66a61","Type":"ContainerStarted","Data":"8237805f5fa13660fced6f433217e62a2e0f8fc394aee9aac1503ae48025bdc5"} Apr 17 16:41:48.986887 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:48.986894 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" event={"ID":"7c3b027b-aaf4-4013-b294-9634cef66a61","Type":"ContainerStarted","Data":"fb4e98363c885fa93f5e64ae016b2eff97d977e3be25f9aecc252760a2fe00c8"} Apr 17 16:41:48.987399 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:48.987290 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:48.987399 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:41:48.987329 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:48.988518 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:48.988268 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:41:49.006989 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:49.006902 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podStartSLOduration=6.006886227 podStartE2EDuration="6.006886227s" podCreationTimestamp="2026-04-17 16:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:41:49.006312188 +0000 UTC m=+606.529337771" watchObservedRunningTime="2026-04-17 16:41:49.006886227 +0000 UTC m=+606.529911811" Apr 17 16:41:49.991126 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:49.991066 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:41:52.738918 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:52.738870 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: 
connect: connection refused" Apr 17 16:41:54.995683 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:54.995648 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:41:54.996339 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:54.996300 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:41:57.739118 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:57.739035 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 17 16:41:57.739574 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:57.739152 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:41:58.741896 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:58.741844 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:41:58.742384 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:41:58.742299 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:42:02.738497 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:02.738438 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 17 16:42:04.996766 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:04.996719 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:42:07.738862 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:07.738771 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 17 16:42:08.053157 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.053071 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" event={"ID":"472fed14-7cb5-47c1-8f39-8fbe93f72718","Type":"ContainerStarted","Data":"60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3"} Apr 17 16:42:08.053157 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.053111 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" 
event={"ID":"472fed14-7cb5-47c1-8f39-8fbe93f72718","Type":"ContainerStarted","Data":"7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2"} Apr 17 16:42:08.053415 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.053396 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:42:08.053537 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.053520 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:42:08.054525 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.054498 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:42:08.071140 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.071088 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podStartSLOduration=5.639996304 podStartE2EDuration="25.071071718s" podCreationTimestamp="2026-04-17 16:41:43 +0000 UTC" firstStartedPulling="2026-04-17 16:41:47.979924016 +0000 UTC m=+605.502949586" lastFinishedPulling="2026-04-17 16:42:07.410999438 +0000 UTC m=+624.934025000" observedRunningTime="2026-04-17 16:42:08.070574133 +0000 UTC m=+625.593599716" watchObservedRunningTime="2026-04-17 16:42:08.071071718 +0000 UTC m=+625.594097326" Apr 17 16:42:08.741598 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.741552 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 17 16:42:08.742054 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.741710 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:42:08.742054 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.741907 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:42:08.742054 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:08.741997 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:42:09.056165 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:09.056079 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:42:12.738791 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:12.738747 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 17 16:42:13.992374 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.992343 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:42:13.997403 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.997381 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6947aac7-9c49-4c08-9cc4-060fa22367f2-kserve-provision-location\") pod \"6947aac7-9c49-4c08-9cc4-060fa22367f2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " Apr 17 16:42:13.997519 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.997418 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6947aac7-9c49-4c08-9cc4-060fa22367f2-isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\") pod \"6947aac7-9c49-4c08-9cc4-060fa22367f2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " Apr 17 16:42:13.997519 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.997442 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27rg6\" (UniqueName: \"kubernetes.io/projected/6947aac7-9c49-4c08-9cc4-060fa22367f2-kube-api-access-27rg6\") pod \"6947aac7-9c49-4c08-9cc4-060fa22367f2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " Apr 17 16:42:13.997519 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.997462 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6947aac7-9c49-4c08-9cc4-060fa22367f2-proxy-tls\") pod \"6947aac7-9c49-4c08-9cc4-060fa22367f2\" (UID: \"6947aac7-9c49-4c08-9cc4-060fa22367f2\") " Apr 17 16:42:13.997758 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.997732 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6947aac7-9c49-4c08-9cc4-060fa22367f2-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "6947aac7-9c49-4c08-9cc4-060fa22367f2" (UID: "6947aac7-9c49-4c08-9cc4-060fa22367f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:42:13.997817 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.997755 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6947aac7-9c49-4c08-9cc4-060fa22367f2-isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config") pod "6947aac7-9c49-4c08-9cc4-060fa22367f2" (UID: "6947aac7-9c49-4c08-9cc4-060fa22367f2"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:42:13.999770 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.999742 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6947aac7-9c49-4c08-9cc4-060fa22367f2-kube-api-access-27rg6" (OuterVolumeSpecName: "kube-api-access-27rg6") pod "6947aac7-9c49-4c08-9cc4-060fa22367f2" (UID: "6947aac7-9c49-4c08-9cc4-060fa22367f2"). InnerVolumeSpecName "kube-api-access-27rg6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:42:13.999841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:13.999751 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6947aac7-9c49-4c08-9cc4-060fa22367f2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6947aac7-9c49-4c08-9cc4-060fa22367f2" (UID: "6947aac7-9c49-4c08-9cc4-060fa22367f2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:42:14.061203 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.061171 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:42:14.061886 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.061851 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:42:14.071295 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.071268 2569 generic.go:358] "Generic (PLEG): container finished" podID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerID="b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7" exitCode=0 Apr 17 16:42:14.071432 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.071302 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerDied","Data":"b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7"} Apr 17 16:42:14.071432 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.071324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" event={"ID":"6947aac7-9c49-4c08-9cc4-060fa22367f2","Type":"ContainerDied","Data":"f3a6fae2c48d63cec586b16f702ab86b0375d652e5cafe1e07da0e993b7a8af8"} Apr 17 16:42:14.071432 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.071339 2569 scope.go:117] "RemoveContainer" containerID="b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7" Apr 17 16:42:14.071432 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.071361 2569 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2" Apr 17 16:42:14.080489 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.080464 2569 scope.go:117] "RemoveContainer" containerID="9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691" Apr 17 16:42:14.087906 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.087887 2569 scope.go:117] "RemoveContainer" containerID="f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1" Apr 17 16:42:14.097384 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.096763 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2"] Apr 17 16:42:14.097478 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.097430 2569 scope.go:117] "RemoveContainer" containerID="25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945" Apr 17 16:42:14.098042 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.098023 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6947aac7-9c49-4c08-9cc4-060fa22367f2-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:42:14.098118 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.098049 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6947aac7-9c49-4c08-9cc4-060fa22367f2-isvc-raw-sklearn-batcher-2a85d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:42:14.098118 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.098065 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-27rg6\" (UniqueName: \"kubernetes.io/projected/6947aac7-9c49-4c08-9cc4-060fa22367f2-kube-api-access-27rg6\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath 
\"\"" Apr 17 16:42:14.098118 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.098079 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6947aac7-9c49-4c08-9cc4-060fa22367f2-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:42:14.101048 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.101028 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2a85d-predictor-5dfd9b949-dzdr2"] Apr 17 16:42:14.104963 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.104947 2569 scope.go:117] "RemoveContainer" containerID="b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7" Apr 17 16:42:14.105196 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:42:14.105178 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7\": container with ID starting with b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7 not found: ID does not exist" containerID="b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7" Apr 17 16:42:14.105237 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.105205 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7"} err="failed to get container status \"b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7\": rpc error: code = NotFound desc = could not find container \"b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7\": container with ID starting with b58ec35f5ba5b635278cd07d2200bd9fbe33ba96d9ec1307f81cf491e04a84d7 not found: ID does not exist" Apr 17 16:42:14.105237 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.105223 2569 scope.go:117] "RemoveContainer" 
containerID="9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691" Apr 17 16:42:14.105460 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:42:14.105445 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691\": container with ID starting with 9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691 not found: ID does not exist" containerID="9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691" Apr 17 16:42:14.105518 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.105464 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691"} err="failed to get container status \"9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691\": rpc error: code = NotFound desc = could not find container \"9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691\": container with ID starting with 9154eca0ae4dc224b200507847649b9ce1596bd8496c6377dcc375884fdd5691 not found: ID does not exist" Apr 17 16:42:14.105518 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.105478 2569 scope.go:117] "RemoveContainer" containerID="f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1" Apr 17 16:42:14.105698 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:42:14.105679 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1\": container with ID starting with f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1 not found: ID does not exist" containerID="f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1" Apr 17 16:42:14.105775 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.105708 2569 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1"} err="failed to get container status \"f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1\": rpc error: code = NotFound desc = could not find container \"f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1\": container with ID starting with f327ed6186af32506a4b92d746375c6b2e2de55ce295b9c2ad2cb03f8a70c4c1 not found: ID does not exist" Apr 17 16:42:14.105775 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.105730 2569 scope.go:117] "RemoveContainer" containerID="25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945" Apr 17 16:42:14.106077 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:42:14.106059 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945\": container with ID starting with 25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945 not found: ID does not exist" containerID="25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945" Apr 17 16:42:14.106114 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.106081 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945"} err="failed to get container status \"25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945\": rpc error: code = NotFound desc = could not find container \"25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945\": container with ID starting with 25f31174314a09cafe345bc5fd8e0ea3a2852b01173ea626bfe874efd0c0a945 not found: ID does not exist" Apr 17 16:42:14.996582 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:14.996540 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" 
podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:42:15.040699 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:15.040667 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" path="/var/lib/kubelet/pods/6947aac7-9c49-4c08-9cc4-060fa22367f2/volumes" Apr 17 16:42:24.062315 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:24.062276 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:42:24.996412 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:24.996373 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:42:34.062108 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:34.062064 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:42:34.997113 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:34.997075 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:42:44.061720 
ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:44.061677 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:42:44.996185 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:44.996141 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:42:54.062446 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:54.062401 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:42:54.996190 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:42:54.996152 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:43:04.061758 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:04.061717 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:43:04.997397 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:04.997369 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:43:14.062227 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:14.062195 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:43:33.674499 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.674409 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k"] Apr 17 16:43:33.675034 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.674843 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" containerID="cri-o://fb4e98363c885fa93f5e64ae016b2eff97d977e3be25f9aecc252760a2fe00c8" gracePeriod=30 Apr 17 16:43:33.675034 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.674999 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kube-rbac-proxy" containerID="cri-o://8237805f5fa13660fced6f433217e62a2e0f8fc394aee9aac1503ae48025bdc5" gracePeriod=30 Apr 17 16:43:33.765813 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.765779 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f"] Apr 17 16:43:33.766186 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766173 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" Apr 17 16:43:33.766229 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766188 2569 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" Apr 17 16:43:33.766229 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766199 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="storage-initializer" Apr 17 16:43:33.766229 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766205 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="storage-initializer" Apr 17 16:43:33.766229 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766220 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" Apr 17 16:43:33.766229 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766226 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" Apr 17 16:43:33.766394 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766233 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" Apr 17 16:43:33.766394 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766238 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" Apr 17 16:43:33.766394 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766314 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kube-rbac-proxy" Apr 17 16:43:33.766394 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766327 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="kserve-container" Apr 17 16:43:33.766394 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.766338 2569 
memory_manager.go:356] "RemoveStaleState removing state" podUID="6947aac7-9c49-4c08-9cc4-060fa22367f2" containerName="agent" Apr 17 16:43:33.769676 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.769657 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.773778 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.773753 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\"" Apr 17 16:43:33.773899 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.773756 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-10815-predictor-serving-cert\"" Apr 17 16:43:33.783809 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.783786 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f"] Apr 17 16:43:33.827685 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.827649 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j"] Apr 17 16:43:33.828022 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.827995 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" containerID="cri-o://7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2" gracePeriod=30 Apr 17 16:43:33.828136 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.828087 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" 
podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kube-rbac-proxy" containerID="cri-o://60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3" gracePeriod=30 Apr 17 16:43:33.875928 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.875897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnnbr\" (UniqueName: \"kubernetes.io/projected/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kube-api-access-vnnbr\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.876038 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.875930 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.876038 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.875965 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.876109 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.876052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.891917 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.891890 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq"] Apr 17 16:43:33.895468 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.895453 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:33.898179 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.898158 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\"" Apr 17 16:43:33.898456 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.898439 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-10815-predictor-serving-cert\"" Apr 17 16:43:33.906453 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.906426 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq"] Apr 17 16:43:33.977041 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f113866f-63b7-45d3-bce0-ac0b5a81c137-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:33.977205 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977050 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:33.977205 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977149 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnnbr\" (UniqueName: \"kubernetes.io/projected/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kube-api-access-vnnbr\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.977205 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.977402 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.977464 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977394 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f113866f-63b7-45d3-bce0-ac0b5a81c137-isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:33.977528 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977457 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.977528 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977497 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzk2\" (UniqueName: \"kubernetes.io/projected/f113866f-63b7-45d3-bce0-ac0b5a81c137-kube-api-access-rxzk2\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:33.977638 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977619 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.977966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.977942 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.979624 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.979606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:33.985655 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:33.985634 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnnbr\" (UniqueName: \"kubernetes.io/projected/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kube-api-access-vnnbr\") pod \"isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:34.057122 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.057081 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 17 16:43:34.062616 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.062586 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 17 16:43:34.078495 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.078465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzk2\" (UniqueName: \"kubernetes.io/projected/f113866f-63b7-45d3-bce0-ac0b5a81c137-kube-api-access-rxzk2\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.078607 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.078515 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f113866f-63b7-45d3-bce0-ac0b5a81c137-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.078710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.078680 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" 
(UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.078815 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.078801 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f113866f-63b7-45d3-bce0-ac0b5a81c137-isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.078864 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:43:34.078814 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-serving-cert: secret "isvc-xgboost-graph-raw-hpa-10815-predictor-serving-cert" not found Apr 17 16:43:34.078907 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:43:34.078875 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls podName:f113866f-63b7-45d3-bce0-ac0b5a81c137 nodeName:}" failed. No retries permitted until 2026-04-17 16:43:34.578859136 +0000 UTC m=+712.101884698 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls") pod "isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" (UID: "f113866f-63b7-45d3-bce0-ac0b5a81c137") : secret "isvc-xgboost-graph-raw-hpa-10815-predictor-serving-cert" not found Apr 17 16:43:34.078907 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.078883 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f113866f-63b7-45d3-bce0-ac0b5a81c137-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.079367 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.079349 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f113866f-63b7-45d3-bce0-ac0b5a81c137-isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.080266 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.080234 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:34.087181 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.087156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzk2\" (UniqueName: \"kubernetes.io/projected/f113866f-63b7-45d3-bce0-ac0b5a81c137-kube-api-access-rxzk2\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.200916 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.200839 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f"] Apr 17 16:43:34.203676 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:43:34.203647 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f33a07_6e34_4ae0_be9e_ee66e6a59b01.slice/crio-0cdca1dffb3d5b34af43d886ae229a5842ca596a15b34f60a89466252d48f80c WatchSource:0}: Error finding container 0cdca1dffb3d5b34af43d886ae229a5842ca596a15b34f60a89466252d48f80c: Status 404 returned error can't find the container with id 0cdca1dffb3d5b34af43d886ae229a5842ca596a15b34f60a89466252d48f80c Apr 17 16:43:34.315036 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.314994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" event={"ID":"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01","Type":"ContainerStarted","Data":"94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1"} Apr 17 16:43:34.315036 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.315033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" 
event={"ID":"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01","Type":"ContainerStarted","Data":"0cdca1dffb3d5b34af43d886ae229a5842ca596a15b34f60a89466252d48f80c"} Apr 17 16:43:34.317222 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.317192 2569 generic.go:358] "Generic (PLEG): container finished" podID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerID="8237805f5fa13660fced6f433217e62a2e0f8fc394aee9aac1503ae48025bdc5" exitCode=2 Apr 17 16:43:34.317367 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.317231 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" event={"ID":"7c3b027b-aaf4-4013-b294-9634cef66a61","Type":"ContainerDied","Data":"8237805f5fa13660fced6f433217e62a2e0f8fc394aee9aac1503ae48025bdc5"} Apr 17 16:43:34.319641 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.319613 2569 generic.go:358] "Generic (PLEG): container finished" podID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerID="60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3" exitCode=2 Apr 17 16:43:34.319762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.319691 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" event={"ID":"472fed14-7cb5-47c1-8f39-8fbe93f72718","Type":"ContainerDied","Data":"60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3"} Apr 17 16:43:34.582131 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.582041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.584463 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.584445 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.806005 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.805954 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:34.926266 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.926220 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq"] Apr 17 16:43:34.929669 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:43:34.929642 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf113866f_63b7_45d3_bce0_ac0b5a81c137.slice/crio-7fa1d478253e039e97c4827f741114f73aa0c8b58bbb07b45bf6ed297bb7ae8e WatchSource:0}: Error finding container 7fa1d478253e039e97c4827f741114f73aa0c8b58bbb07b45bf6ed297bb7ae8e: Status 404 returned error can't find the container with id 7fa1d478253e039e97c4827f741114f73aa0c8b58bbb07b45bf6ed297bb7ae8e Apr 17 16:43:34.991824 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.991799 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 17 16:43:34.996208 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:34.996183 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 17 16:43:35.325849 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:35.325798 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" event={"ID":"f113866f-63b7-45d3-bce0-ac0b5a81c137","Type":"ContainerStarted","Data":"f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc"} Apr 17 16:43:35.325849 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:35.325849 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" event={"ID":"f113866f-63b7-45d3-bce0-ac0b5a81c137","Type":"ContainerStarted","Data":"7fa1d478253e039e97c4827f741114f73aa0c8b58bbb07b45bf6ed297bb7ae8e"} Apr 17 16:43:37.771481 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.771458 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:43:37.917049 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.916958 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/472fed14-7cb5-47c1-8f39-8fbe93f72718-isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\") pod \"472fed14-7cb5-47c1-8f39-8fbe93f72718\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " Apr 17 16:43:37.917202 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.917074 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/472fed14-7cb5-47c1-8f39-8fbe93f72718-proxy-tls\") pod \"472fed14-7cb5-47c1-8f39-8fbe93f72718\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " Apr 17 16:43:37.917202 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.917112 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/472fed14-7cb5-47c1-8f39-8fbe93f72718-kserve-provision-location\") pod \"472fed14-7cb5-47c1-8f39-8fbe93f72718\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " Apr 17 16:43:37.917202 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.917128 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv7vr\" (UniqueName: \"kubernetes.io/projected/472fed14-7cb5-47c1-8f39-8fbe93f72718-kube-api-access-qv7vr\") pod \"472fed14-7cb5-47c1-8f39-8fbe93f72718\" (UID: \"472fed14-7cb5-47c1-8f39-8fbe93f72718\") " Apr 17 16:43:37.917467 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.917431 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/472fed14-7cb5-47c1-8f39-8fbe93f72718-isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config") pod "472fed14-7cb5-47c1-8f39-8fbe93f72718" (UID: "472fed14-7cb5-47c1-8f39-8fbe93f72718"). InnerVolumeSpecName "isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:43:37.917510 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.917475 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472fed14-7cb5-47c1-8f39-8fbe93f72718-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "472fed14-7cb5-47c1-8f39-8fbe93f72718" (UID: "472fed14-7cb5-47c1-8f39-8fbe93f72718"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:43:37.919245 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.919217 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472fed14-7cb5-47c1-8f39-8fbe93f72718-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "472fed14-7cb5-47c1-8f39-8fbe93f72718" (UID: "472fed14-7cb5-47c1-8f39-8fbe93f72718"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:43:37.919405 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:37.919359 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472fed14-7cb5-47c1-8f39-8fbe93f72718-kube-api-access-qv7vr" (OuterVolumeSpecName: "kube-api-access-qv7vr") pod "472fed14-7cb5-47c1-8f39-8fbe93f72718" (UID: "472fed14-7cb5-47c1-8f39-8fbe93f72718"). InnerVolumeSpecName "kube-api-access-qv7vr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:43:38.017922 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.017878 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/472fed14-7cb5-47c1-8f39-8fbe93f72718-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:43:38.017922 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.017912 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/472fed14-7cb5-47c1-8f39-8fbe93f72718-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:43:38.017922 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.017924 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qv7vr\" (UniqueName: \"kubernetes.io/projected/472fed14-7cb5-47c1-8f39-8fbe93f72718-kube-api-access-qv7vr\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:43:38.018152 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.017935 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/472fed14-7cb5-47c1-8f39-8fbe93f72718-isvc-xgboost-graph-raw-f88b3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:43:38.337982 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.337949 2569 generic.go:358] "Generic (PLEG): container finished" podID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerID="94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1" exitCode=0 Apr 17 16:43:38.338146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.338035 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" 
event={"ID":"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01","Type":"ContainerDied","Data":"94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1"} Apr 17 16:43:38.340305 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.340066 2569 generic.go:358] "Generic (PLEG): container finished" podID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerID="fb4e98363c885fa93f5e64ae016b2eff97d977e3be25f9aecc252760a2fe00c8" exitCode=0 Apr 17 16:43:38.340305 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.340149 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" event={"ID":"7c3b027b-aaf4-4013-b294-9634cef66a61","Type":"ContainerDied","Data":"fb4e98363c885fa93f5e64ae016b2eff97d977e3be25f9aecc252760a2fe00c8"} Apr 17 16:43:38.342300 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.342270 2569 generic.go:358] "Generic (PLEG): container finished" podID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerID="7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2" exitCode=0 Apr 17 16:43:38.342418 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.342381 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" event={"ID":"472fed14-7cb5-47c1-8f39-8fbe93f72718","Type":"ContainerDied","Data":"7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2"} Apr 17 16:43:38.342418 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.342390 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" Apr 17 16:43:38.342418 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.342409 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j" event={"ID":"472fed14-7cb5-47c1-8f39-8fbe93f72718","Type":"ContainerDied","Data":"c8fc626c9853d19974bfad228f810bb52230c4287a133ea256ae6bab14927edf"} Apr 17 16:43:38.342582 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.342429 2569 scope.go:117] "RemoveContainer" containerID="60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3" Apr 17 16:43:38.355276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.355234 2569 scope.go:117] "RemoveContainer" containerID="7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2" Apr 17 16:43:38.365802 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.365780 2569 scope.go:117] "RemoveContainer" containerID="05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b" Apr 17 16:43:38.371919 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.371896 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j"] Apr 17 16:43:38.375604 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.375582 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-f88b3-predictor-55768ff57-2b49j"] Apr 17 16:43:38.385050 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.385029 2569 scope.go:117] "RemoveContainer" containerID="60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3" Apr 17 16:43:38.385525 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:43:38.385500 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3\": container with ID 
starting with 60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3 not found: ID does not exist" containerID="60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3" Apr 17 16:43:38.385629 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.385532 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3"} err="failed to get container status \"60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3\": rpc error: code = NotFound desc = could not find container \"60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3\": container with ID starting with 60de02db1ead812e14057a7d3ac22a8a5fdfd85c8cb25993c44023e0009bd1a3 not found: ID does not exist" Apr 17 16:43:38.385629 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.385551 2569 scope.go:117] "RemoveContainer" containerID="7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2" Apr 17 16:43:38.385845 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:43:38.385817 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2\": container with ID starting with 7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2 not found: ID does not exist" containerID="7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2" Apr 17 16:43:38.385942 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.385855 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2"} err="failed to get container status \"7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2\": rpc error: code = NotFound desc = could not find container \"7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2\": container with ID starting with 
7f0a48ddf78873f63731a016b11faa31bacd53564b2496f8adccbcbd131290d2 not found: ID does not exist" Apr 17 16:43:38.385942 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.385878 2569 scope.go:117] "RemoveContainer" containerID="05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b" Apr 17 16:43:38.386166 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:43:38.386145 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b\": container with ID starting with 05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b not found: ID does not exist" containerID="05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b" Apr 17 16:43:38.386240 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.386169 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b"} err="failed to get container status \"05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b\": rpc error: code = NotFound desc = could not find container \"05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b\": container with ID starting with 05c27fe9260b23ed2d293d047e53f37fb9ab06e69635e947eac6b3967026f56b not found: ID does not exist" Apr 17 16:43:38.413635 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.413612 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:43:38.522120 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.522027 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c3b027b-aaf4-4013-b294-9634cef66a61-kserve-provision-location\") pod \"7c3b027b-aaf4-4013-b294-9634cef66a61\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " Apr 17 16:43:38.522120 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.522107 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c3b027b-aaf4-4013-b294-9634cef66a61-proxy-tls\") pod \"7c3b027b-aaf4-4013-b294-9634cef66a61\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " Apr 17 16:43:38.522360 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.522173 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6n4n\" (UniqueName: \"kubernetes.io/projected/7c3b027b-aaf4-4013-b294-9634cef66a61-kube-api-access-c6n4n\") pod \"7c3b027b-aaf4-4013-b294-9634cef66a61\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " Apr 17 16:43:38.522360 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.522219 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c3b027b-aaf4-4013-b294-9634cef66a61-isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\") pod \"7c3b027b-aaf4-4013-b294-9634cef66a61\" (UID: \"7c3b027b-aaf4-4013-b294-9634cef66a61\") " Apr 17 16:43:38.522502 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.522472 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3b027b-aaf4-4013-b294-9634cef66a61-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "7c3b027b-aaf4-4013-b294-9634cef66a61" (UID: "7c3b027b-aaf4-4013-b294-9634cef66a61"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:43:38.522689 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.522658 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3b027b-aaf4-4013-b294-9634cef66a61-isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config") pod "7c3b027b-aaf4-4013-b294-9634cef66a61" (UID: "7c3b027b-aaf4-4013-b294-9634cef66a61"). InnerVolumeSpecName "isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:43:38.524284 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.524241 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3b027b-aaf4-4013-b294-9634cef66a61-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7c3b027b-aaf4-4013-b294-9634cef66a61" (UID: "7c3b027b-aaf4-4013-b294-9634cef66a61"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:43:38.524380 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.524312 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3b027b-aaf4-4013-b294-9634cef66a61-kube-api-access-c6n4n" (OuterVolumeSpecName: "kube-api-access-c6n4n") pod "7c3b027b-aaf4-4013-b294-9634cef66a61" (UID: "7c3b027b-aaf4-4013-b294-9634cef66a61"). InnerVolumeSpecName "kube-api-access-c6n4n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:43:38.623872 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.623831 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c3b027b-aaf4-4013-b294-9634cef66a61-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:43:38.623872 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.623864 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6n4n\" (UniqueName: \"kubernetes.io/projected/7c3b027b-aaf4-4013-b294-9634cef66a61-kube-api-access-c6n4n\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:43:38.624064 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.623882 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c3b027b-aaf4-4013-b294-9634cef66a61-isvc-sklearn-graph-raw-f88b3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:43:38.624064 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:38.623896 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c3b027b-aaf4-4013-b294-9634cef66a61-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:43:39.041982 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.041947 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" path="/var/lib/kubelet/pods/472fed14-7cb5-47c1-8f39-8fbe93f72718/volumes" Apr 17 16:43:39.348405 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.348305 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" 
event={"ID":"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01","Type":"ContainerStarted","Data":"c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c"} Apr 17 16:43:39.348405 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.348348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" event={"ID":"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01","Type":"ContainerStarted","Data":"cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2"} Apr 17 16:43:39.348729 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.348701 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:39.348729 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.348733 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:39.350035 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.350005 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" event={"ID":"7c3b027b-aaf4-4013-b294-9634cef66a61","Type":"ContainerDied","Data":"e4b11df23d23e05c80608eb0286ce3857f5b540ec34b7ba0efa8a2700f47fc35"} Apr 17 16:43:39.350146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.350041 2569 scope.go:117] "RemoveContainer" containerID="8237805f5fa13660fced6f433217e62a2e0f8fc394aee9aac1503ae48025bdc5" Apr 17 16:43:39.350146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.350040 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k" Apr 17 16:43:39.350446 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.350402 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:43:39.351485 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.351463 2569 generic.go:358] "Generic (PLEG): container finished" podID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerID="f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc" exitCode=0 Apr 17 16:43:39.351567 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.351531 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" event={"ID":"f113866f-63b7-45d3-bce0-ac0b5a81c137","Type":"ContainerDied","Data":"f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc"} Apr 17 16:43:39.358318 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.358296 2569 scope.go:117] "RemoveContainer" containerID="fb4e98363c885fa93f5e64ae016b2eff97d977e3be25f9aecc252760a2fe00c8" Apr 17 16:43:39.365650 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.365632 2569 scope.go:117] "RemoveContainer" containerID="7fa8ba14cad7e2d78a3464454d751eeeed55b995d8713bd74319ccc2cfc0d26c" Apr 17 16:43:39.369318 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.369266 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podStartSLOduration=6.369232414 podStartE2EDuration="6.369232414s" podCreationTimestamp="2026-04-17 16:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-17 16:43:39.366691549 +0000 UTC m=+716.889717133" watchObservedRunningTime="2026-04-17 16:43:39.369232414 +0000 UTC m=+716.892258001" Apr 17 16:43:39.396660 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.396633 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k"] Apr 17 16:43:39.402658 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:39.402629 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-f88b3-predictor-65f4fbb7f7-jpg8k"] Apr 17 16:43:40.359087 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:40.359052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" event={"ID":"f113866f-63b7-45d3-bce0-ac0b5a81c137","Type":"ContainerStarted","Data":"275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad"} Apr 17 16:43:40.359087 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:40.359090 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" event={"ID":"f113866f-63b7-45d3-bce0-ac0b5a81c137","Type":"ContainerStarted","Data":"98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c"} Apr 17 16:43:40.359545 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:40.359489 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:43:40.359545 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:40.359515 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 
16:43:40.359646 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:40.359628 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:40.360956 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:40.360927 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:43:40.377563 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:40.377527 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podStartSLOduration=7.377514676 podStartE2EDuration="7.377514676s" podCreationTimestamp="2026-04-17 16:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:43:40.375897428 +0000 UTC m=+717.898923012" watchObservedRunningTime="2026-04-17 16:43:40.377514676 +0000 UTC m=+717.900540306" Apr 17 16:43:41.041549 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:41.041512 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" path="/var/lib/kubelet/pods/7c3b027b-aaf4-4013-b294-9634cef66a61/volumes" Apr 17 16:43:41.362808 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:41.362767 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:43:45.370391 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:45.370362 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:43:45.371031 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:45.371001 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:43:46.367046 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:46.367019 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:43:46.367659 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:46.367625 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:43:55.371857 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:55.371819 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:43:56.368117 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:43:56.368078 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:44:05.371139 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:44:05.371096 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:44:06.368144 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:06.368103 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:44:15.371707 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:15.371665 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:44:16.367899 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:16.367858 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:44:25.371669 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:25.371631 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:44:26.367549 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:26.367512 2569 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:44:35.371731 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:35.371693 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:44:36.367777 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:36.367735 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:44:45.372168 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:45.372141 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:44:46.368432 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:44:46.368397 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:45:13.965070 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:13.964990 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f"] Apr 17 16:45:13.965533 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:13.965340 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" containerID="cri-o://cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2" gracePeriod=30 Apr 17 16:45:13.965533 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:13.965394 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kube-rbac-proxy" containerID="cri-o://c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c" gracePeriod=30 Apr 17 16:45:14.021279 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021230 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr"] Apr 17 16:45:14.021595 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021582 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kube-rbac-proxy" Apr 17 16:45:14.021640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021597 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kube-rbac-proxy" Apr 17 16:45:14.021640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021605 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" Apr 17 16:45:14.021640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021611 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" Apr 17 16:45:14.021640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021624 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kube-rbac-proxy" Apr 17 16:45:14.021640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021630 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kube-rbac-proxy" Apr 17 16:45:14.021640 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021638 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="storage-initializer" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021643 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="storage-initializer" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021658 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="storage-initializer" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021662 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="storage-initializer" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021672 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021677 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021720 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kube-rbac-proxy" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021731 2569 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="472fed14-7cb5-47c1-8f39-8fbe93f72718" containerName="kserve-container" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021736 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kube-rbac-proxy" Apr 17 16:45:14.021828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.021744 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c3b027b-aaf4-4013-b294-9634cef66a61" containerName="kserve-container" Apr 17 16:45:14.024351 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.024334 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.026890 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.026861 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\"" Apr 17 16:45:14.027007 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.026904 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-cd35e-predictor-serving-cert\"" Apr 17 16:45:14.033375 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.033348 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr"] Apr 17 16:45:14.065600 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.065568 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq"] Apr 17 16:45:14.065932 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.065907 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" 
containerName="kserve-container" containerID="cri-o://98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c" gracePeriod=30 Apr 17 16:45:14.066038 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.065945 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kube-rbac-proxy" containerID="cri-o://275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad" gracePeriod=30 Apr 17 16:45:14.140215 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.140180 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-proxy-tls\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.140392 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.140228 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.140392 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.140370 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllb6\" (UniqueName: \"kubernetes.io/projected/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-kube-api-access-zllb6\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " 
pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.241286 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.241187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-proxy-tls\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.241286 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.241239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.241487 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.241343 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zllb6\" (UniqueName: \"kubernetes.io/projected/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-kube-api-access-zllb6\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.242134 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.242101 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " 
pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.243580 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.243558 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-proxy-tls\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.250854 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.250830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllb6\" (UniqueName: \"kubernetes.io/projected/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-kube-api-access-zllb6\") pod \"message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") " pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.336273 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.336211 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:14.456368 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.456335 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr"] Apr 17 16:45:14.459445 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:45:14.459416 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3abe04ba_6a27_4d5f_b85d_e7df00cffb69.slice/crio-86bc7f7484a3560e63e14f1405ca4ac6e63983306db8e4eaee2afa3676db58ee WatchSource:0}: Error finding container 86bc7f7484a3560e63e14f1405ca4ac6e63983306db8e4eaee2afa3676db58ee: Status 404 returned error can't find the container with id 86bc7f7484a3560e63e14f1405ca4ac6e63983306db8e4eaee2afa3676db58ee Apr 17 16:45:14.461115 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.461098 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:45:14.641742 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.641691 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" event={"ID":"3abe04ba-6a27-4d5f-b85d-e7df00cffb69","Type":"ContainerStarted","Data":"86bc7f7484a3560e63e14f1405ca4ac6e63983306db8e4eaee2afa3676db58ee"} Apr 17 16:45:14.643604 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.643571 2569 generic.go:358] "Generic (PLEG): container finished" podID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerID="c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c" exitCode=2 Apr 17 16:45:14.643748 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.643639 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" 
event={"ID":"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01","Type":"ContainerDied","Data":"c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c"} Apr 17 16:45:14.645319 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.645298 2569 generic.go:358] "Generic (PLEG): container finished" podID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerID="275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad" exitCode=2 Apr 17 16:45:14.645410 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:14.645331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" event={"ID":"f113866f-63b7-45d3-bce0-ac0b5a81c137","Type":"ContainerDied","Data":"275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad"} Apr 17 16:45:15.359717 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:15.359674 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 17 16:45:15.371063 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:15.371029 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 17 16:45:16.363831 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:16.363787 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 
10.134.0.24:8643: connect: connection refused" Apr 17 16:45:16.368155 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:16.368130 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 17 16:45:16.653971 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:16.653882 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" event={"ID":"3abe04ba-6a27-4d5f-b85d-e7df00cffb69","Type":"ContainerStarted","Data":"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867"} Apr 17 16:45:16.653971 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:16.653925 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" event={"ID":"3abe04ba-6a27-4d5f-b85d-e7df00cffb69","Type":"ContainerStarted","Data":"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d"} Apr 17 16:45:16.654148 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:16.654098 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:16.654148 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:16.654117 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:16.655880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:16.655858 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:16.672016 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:16.671973 2569 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" podStartSLOduration=1.339840846 podStartE2EDuration="2.671959908s" podCreationTimestamp="2026-04-17 16:45:14 +0000 UTC" firstStartedPulling="2026-04-17 16:45:14.461271747 +0000 UTC m=+811.984297308" lastFinishedPulling="2026-04-17 16:45:15.793390795 +0000 UTC m=+813.316416370" observedRunningTime="2026-04-17 16:45:16.670321834 +0000 UTC m=+814.193347417" watchObservedRunningTime="2026-04-17 16:45:16.671959908 +0000 UTC m=+814.194985491" Apr 17 16:45:17.811940 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.811912 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:45:17.870127 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.870096 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f113866f-63b7-45d3-bce0-ac0b5a81c137-isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") pod \"f113866f-63b7-45d3-bce0-ac0b5a81c137\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " Apr 17 16:45:17.870127 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.870130 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxzk2\" (UniqueName: \"kubernetes.io/projected/f113866f-63b7-45d3-bce0-ac0b5a81c137-kube-api-access-rxzk2\") pod \"f113866f-63b7-45d3-bce0-ac0b5a81c137\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " Apr 17 16:45:17.870347 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.870191 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls\") pod \"f113866f-63b7-45d3-bce0-ac0b5a81c137\" (UID: 
\"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " Apr 17 16:45:17.870347 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.870218 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f113866f-63b7-45d3-bce0-ac0b5a81c137-kserve-provision-location\") pod \"f113866f-63b7-45d3-bce0-ac0b5a81c137\" (UID: \"f113866f-63b7-45d3-bce0-ac0b5a81c137\") " Apr 17 16:45:17.870524 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.870497 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f113866f-63b7-45d3-bce0-ac0b5a81c137-isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config") pod "f113866f-63b7-45d3-bce0-ac0b5a81c137" (UID: "f113866f-63b7-45d3-bce0-ac0b5a81c137"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:45:17.870656 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.870631 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f113866f-63b7-45d3-bce0-ac0b5a81c137-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f113866f-63b7-45d3-bce0-ac0b5a81c137" (UID: "f113866f-63b7-45d3-bce0-ac0b5a81c137"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:17.872347 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.872288 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f113866f-63b7-45d3-bce0-ac0b5a81c137-kube-api-access-rxzk2" (OuterVolumeSpecName: "kube-api-access-rxzk2") pod "f113866f-63b7-45d3-bce0-ac0b5a81c137" (UID: "f113866f-63b7-45d3-bce0-ac0b5a81c137"). InnerVolumeSpecName "kube-api-access-rxzk2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:45:17.872347 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.872301 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f113866f-63b7-45d3-bce0-ac0b5a81c137" (UID: "f113866f-63b7-45d3-bce0-ac0b5a81c137"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:45:17.971463 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.971424 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f113866f-63b7-45d3-bce0-ac0b5a81c137-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:45:17.971463 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.971459 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f113866f-63b7-45d3-bce0-ac0b5a81c137-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:45:17.971463 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.971470 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f113866f-63b7-45d3-bce0-ac0b5a81c137-isvc-xgboost-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:45:17.971688 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:17.971482 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxzk2\" (UniqueName: \"kubernetes.io/projected/f113866f-63b7-45d3-bce0-ac0b5a81c137-kube-api-access-rxzk2\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:45:18.582772 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.582746 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:45:18.662962 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.662926 2569 generic.go:358] "Generic (PLEG): container finished" podID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerID="cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2" exitCode=0 Apr 17 16:45:18.663116 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.663014 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" Apr 17 16:45:18.663116 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.663010 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" event={"ID":"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01","Type":"ContainerDied","Data":"cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2"} Apr 17 16:45:18.663286 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.663126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f" event={"ID":"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01","Type":"ContainerDied","Data":"0cdca1dffb3d5b34af43d886ae229a5842ca596a15b34f60a89466252d48f80c"} Apr 17 16:45:18.663286 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.663143 2569 scope.go:117] "RemoveContainer" containerID="c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c" Apr 17 16:45:18.664956 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.664932 2569 generic.go:358] "Generic (PLEG): container finished" podID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerID="98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c" exitCode=0 Apr 17 16:45:18.665091 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.664998 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" Apr 17 16:45:18.665091 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.665008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" event={"ID":"f113866f-63b7-45d3-bce0-ac0b5a81c137","Type":"ContainerDied","Data":"98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c"} Apr 17 16:45:18.665091 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.665042 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq" event={"ID":"f113866f-63b7-45d3-bce0-ac0b5a81c137","Type":"ContainerDied","Data":"7fa1d478253e039e97c4827f741114f73aa0c8b58bbb07b45bf6ed297bb7ae8e"} Apr 17 16:45:18.671439 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.671424 2569 scope.go:117] "RemoveContainer" containerID="cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2" Apr 17 16:45:18.677541 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.677521 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") pod \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " Apr 17 16:45:18.677652 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.677579 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-proxy-tls\") pod \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " Apr 17 16:45:18.677652 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.677635 2569 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kserve-provision-location\") pod \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " Apr 17 16:45:18.677743 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.677670 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnnbr\" (UniqueName: \"kubernetes.io/projected/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kube-api-access-vnnbr\") pod \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\" (UID: \"e7f33a07-6e34-4ae0-be9e-ee66e6a59b01\") " Apr 17 16:45:18.677936 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.677905 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config") pod "e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" (UID: "e7f33a07-6e34-4ae0-be9e-ee66e6a59b01"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:45:18.677995 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.677945 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" (UID: "e7f33a07-6e34-4ae0-be9e-ee66e6a59b01"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:18.679946 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.679923 2569 scope.go:117] "RemoveContainer" containerID="94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1" Apr 17 16:45:18.687796 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.687772 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq"] Apr 17 16:45:18.691611 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.691569 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" (UID: "e7f33a07-6e34-4ae0-be9e-ee66e6a59b01"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:45:18.691797 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.691779 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-10815-predictor-74fcc98b7c-lzczq"] Apr 17 16:45:18.695030 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.694860 2569 scope.go:117] "RemoveContainer" containerID="c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c" Apr 17 16:45:18.695148 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:45:18.695128 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c\": container with ID starting with c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c not found: ID does not exist" containerID="c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c" Apr 17 16:45:18.695199 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.695156 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c"} err="failed to get container status \"c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c\": rpc error: code = NotFound desc = could not find container \"c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c\": container with ID starting with c87921d7cb76b5aa16698f58362cf2e2cfb0c94b5e68ea7884a68463b3e15e7c not found: ID does not exist" Apr 17 16:45:18.695199 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.695173 2569 scope.go:117] "RemoveContainer" containerID="cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2" Apr 17 16:45:18.695434 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:45:18.695418 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2\": container with ID starting with cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2 not found: ID does not exist" containerID="cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2" Apr 17 16:45:18.695471 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.695440 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2"} err="failed to get container status \"cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2\": rpc error: code = NotFound desc = could not find container \"cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2\": container with ID starting with cef52e4350ef905c6defd31a51eb61d48582502ddce56148d8fdbe9ac8485bd2 not found: ID does not exist" Apr 17 16:45:18.695471 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.695462 2569 scope.go:117] "RemoveContainer" containerID="94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1" Apr 17 16:45:18.695668 ip-10-0-135-127 
kubenswrapper[2569]: E0417 16:45:18.695651 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1\": container with ID starting with 94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1 not found: ID does not exist" containerID="94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1" Apr 17 16:45:18.695716 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.695670 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1"} err="failed to get container status \"94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1\": rpc error: code = NotFound desc = could not find container \"94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1\": container with ID starting with 94cdc45bf28c449cebb97bcccc6c96f6f12f1cabe4ff377bf766e5b2b57e04f1 not found: ID does not exist" Apr 17 16:45:18.695716 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.695682 2569 scope.go:117] "RemoveContainer" containerID="275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad" Apr 17 16:45:18.700557 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.700530 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kube-api-access-vnnbr" (OuterVolumeSpecName: "kube-api-access-vnnbr") pod "e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" (UID: "e7f33a07-6e34-4ae0-be9e-ee66e6a59b01"). InnerVolumeSpecName "kube-api-access-vnnbr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:45:18.702309 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.702297 2569 scope.go:117] "RemoveContainer" containerID="98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c" Apr 17 16:45:18.708786 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.708766 2569 scope.go:117] "RemoveContainer" containerID="f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc" Apr 17 16:45:18.715471 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.715458 2569 scope.go:117] "RemoveContainer" containerID="275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad" Apr 17 16:45:18.715679 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:45:18.715664 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad\": container with ID starting with 275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad not found: ID does not exist" containerID="275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad" Apr 17 16:45:18.715730 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.715685 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad"} err="failed to get container status \"275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad\": rpc error: code = NotFound desc = could not find container \"275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad\": container with ID starting with 275a17c1a065725861e66ac39428bb0de56e53660ad06a50362f9f292037d3ad not found: ID does not exist" Apr 17 16:45:18.715730 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.715699 2569 scope.go:117] "RemoveContainer" containerID="98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c" Apr 17 16:45:18.715882 ip-10-0-135-127 
kubenswrapper[2569]: E0417 16:45:18.715865 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c\": container with ID starting with 98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c not found: ID does not exist" containerID="98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c" Apr 17 16:45:18.715923 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.715888 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c"} err="failed to get container status \"98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c\": rpc error: code = NotFound desc = could not find container \"98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c\": container with ID starting with 98ef4b9e6524fdac114d76c2c6cb6fcb5c96492ba57399b3a731c4b286df155c not found: ID does not exist" Apr 17 16:45:18.715923 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.715903 2569 scope.go:117] "RemoveContainer" containerID="f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc" Apr 17 16:45:18.716104 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:45:18.716090 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc\": container with ID starting with f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc not found: ID does not exist" containerID="f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc" Apr 17 16:45:18.716141 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.716107 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc"} 
err="failed to get container status \"f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc\": rpc error: code = NotFound desc = could not find container \"f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc\": container with ID starting with f22108bebaec614f153918fb4f27181372b24b37531760f916b9d4f224409dbc not found: ID does not exist" Apr 17 16:45:18.778777 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.778745 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:45:18.778777 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.778774 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:45:18.778927 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.778789 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vnnbr\" (UniqueName: \"kubernetes.io/projected/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-kube-api-access-vnnbr\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:45:18.778927 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.778807 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01-isvc-sklearn-graph-raw-hpa-10815-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:45:18.985636 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:18.985602 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f"] Apr 17 16:45:18.989887 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:45:18.989860 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-10815-predictor-57dfb49985-zjk5f"] Apr 17 16:45:19.041147 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:19.041112 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" path="/var/lib/kubelet/pods/e7f33a07-6e34-4ae0-be9e-ee66e6a59b01/volumes" Apr 17 16:45:19.041611 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:19.041597 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" path="/var/lib/kubelet/pods/f113866f-63b7-45d3-bce0-ac0b5a81c137/volumes" Apr 17 16:45:23.669220 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:23.669193 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" Apr 17 16:45:24.056471 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056389 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"] Apr 17 16:45:24.056714 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056702 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="storage-initializer" Apr 17 16:45:24.056764 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056716 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="storage-initializer" Apr 17 16:45:24.056764 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056731 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kube-rbac-proxy" Apr 17 16:45:24.056764 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056737 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kube-rbac-proxy" Apr 17 16:45:24.056764 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056744 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" Apr 17 16:45:24.056764 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056750 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" Apr 17 16:45:24.056764 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056760 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="storage-initializer" Apr 17 16:45:24.056764 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056766 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="storage-initializer" Apr 17 16:45:24.057015 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056775 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" Apr 17 16:45:24.057015 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056780 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" Apr 17 16:45:24.057015 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056786 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kube-rbac-proxy" Apr 17 16:45:24.057015 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056791 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kube-rbac-proxy" Apr 17 16:45:24.057015 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056843 2569 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kube-rbac-proxy" Apr 17 16:45:24.057015 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056851 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kserve-container" Apr 17 16:45:24.057015 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056859 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f113866f-63b7-45d3-bce0-ac0b5a81c137" containerName="kube-rbac-proxy" Apr 17 16:45:24.057015 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.056867 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7f33a07-6e34-4ae0-be9e-ee66e6a59b01" containerName="kserve-container" Apr 17 16:45:24.061605 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.061583 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.064223 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.064199 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-cd35e-predictor-serving-cert\"" Apr 17 16:45:24.064458 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.064225 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\"" Apr 17 16:45:24.070295 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.070267 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"] Apr 17 16:45:24.226583 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.226539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7wg\" (UniqueName: 
\"kubernetes.io/projected/f8b87406-cbce-4e83-8693-c171eb8196b5-kube-api-access-2w7wg\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.226583 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.226591 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b87406-cbce-4e83-8693-c171eb8196b5-kserve-provision-location\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.226821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.226663 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8b87406-cbce-4e83-8693-c171eb8196b5-isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.226821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.226710 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8b87406-cbce-4e83-8693-c171eb8196b5-proxy-tls\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.328103 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.328002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7wg\" (UniqueName: 
\"kubernetes.io/projected/f8b87406-cbce-4e83-8693-c171eb8196b5-kube-api-access-2w7wg\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.328103 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.328054 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b87406-cbce-4e83-8693-c171eb8196b5-kserve-provision-location\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.328103 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.328096 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8b87406-cbce-4e83-8693-c171eb8196b5-isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.328441 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.328141 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8b87406-cbce-4e83-8693-c171eb8196b5-proxy-tls\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.328528 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.328506 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f8b87406-cbce-4e83-8693-c171eb8196b5-kserve-provision-location\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.328811 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.328792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8b87406-cbce-4e83-8693-c171eb8196b5-isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.330523 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.330499 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8b87406-cbce-4e83-8693-c171eb8196b5-proxy-tls\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.336718 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.336692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7wg\" (UniqueName: \"kubernetes.io/projected/f8b87406-cbce-4e83-8693-c171eb8196b5-kube-api-access-2w7wg\") pod \"isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") " pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.372417 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.372390 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:24.494356 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.494320 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"] Apr 17 16:45:24.497603 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:45:24.497573 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b87406_cbce_4e83_8693_c171eb8196b5.slice/crio-2a7aca59f6a180a85aa50f1c40941c79f8c009dc747d492898e01a9d86664b82 WatchSource:0}: Error finding container 2a7aca59f6a180a85aa50f1c40941c79f8c009dc747d492898e01a9d86664b82: Status 404 returned error can't find the container with id 2a7aca59f6a180a85aa50f1c40941c79f8c009dc747d492898e01a9d86664b82 Apr 17 16:45:24.686362 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.686327 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerStarted","Data":"6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481"} Apr 17 16:45:24.686362 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:24.686367 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerStarted","Data":"2a7aca59f6a180a85aa50f1c40941c79f8c009dc747d492898e01a9d86664b82"} Apr 17 16:45:28.699938 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:28.699895 2569 generic.go:358] "Generic (PLEG): container finished" podID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerID="6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481" exitCode=0 Apr 17 16:45:28.700337 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:28.699967 2569 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerDied","Data":"6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481"} Apr 17 16:45:29.704990 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:29.704951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerStarted","Data":"0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a"} Apr 17 16:45:29.704990 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:29.704997 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerStarted","Data":"85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a"} Apr 17 16:45:29.705469 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:29.705008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerStarted","Data":"05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a"} Apr 17 16:45:29.705469 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:29.705246 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:29.726549 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:29.726498 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podStartSLOduration=5.726484483 podStartE2EDuration="5.726484483s" podCreationTimestamp="2026-04-17 16:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-17 16:45:29.72433588 +0000 UTC m=+827.247361465" watchObservedRunningTime="2026-04-17 16:45:29.726484483 +0000 UTC m=+827.249510067" Apr 17 16:45:30.708244 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:30.708207 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:30.708244 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:30.708241 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:30.709655 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:30.709611 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 16:45:30.710308 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:30.710280 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:31.711063 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:31.711013 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 16:45:31.711536 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:31.711513 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" 
podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:36.714870 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:36.714840 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:45:36.715526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:36.715480 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 16:45:36.715793 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:36.715767 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:46.715923 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:46.715882 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 16:45:46.716448 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:46.716431 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:45:56.715528 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:56.715482 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 16:45:56.715926 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:45:56.715809 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:46:06.715801 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:06.715754 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 16:46:06.716301 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:06.716274 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:46:16.715557 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:16.715512 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 17 16:46:16.716008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:16.715985 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" 
output="HTTP probe failed with statuscode: 503"
Apr 17 16:46:26.716098 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:26.715992 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 17 16:46:26.716621 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:26.716584 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:46:36.716465 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:36.716430 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"
Apr 17 16:46:36.717003 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:36.716958 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"
Apr 17 16:46:49.073745 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.073709 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr_3abe04ba-6a27-4d5f-b85d-e7df00cffb69/kserve-container/0.log"
Apr 17 16:46:49.264977 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.264944 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"]
Apr 17 16:46:49.265397 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.265356 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" containerID="cri-o://05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a" gracePeriod=30
Apr 17 16:46:49.265518 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.265392 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" containerID="cri-o://85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a" gracePeriod=30
Apr 17 16:46:49.265518 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.265392 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" containerID="cri-o://0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a" gracePeriod=30
Apr 17 16:46:49.310815 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.307641 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"]
Apr 17 16:46:49.313131 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.313099 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.315879 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.315859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-b7bd1-predictor-serving-cert\""
Apr 17 16:46:49.315974 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.315861 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\""
Apr 17 16:46:49.320879 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.320856 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"]
Apr 17 16:46:49.362523 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.362494 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr"]
Apr 17 16:46:49.362792 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.362768 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerName="kserve-container" containerID="cri-o://d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d" gracePeriod=30
Apr 17 16:46:49.362863 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.362799 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerName="kube-rbac-proxy" containerID="cri-o://7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867" gracePeriod=30
Apr 17 16:46:49.413860 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.413822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55rbb\" (UniqueName: \"kubernetes.io/projected/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kube-api-access-55rbb\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.413966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.413864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.413966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.413960 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.414053 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.413994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.515371 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.515336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55rbb\" (UniqueName: \"kubernetes.io/projected/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kube-api-access-55rbb\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.515542 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.515374 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.515542 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.515466 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.515542 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.515511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.515714 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:46:49.515635 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-serving-cert: secret "isvc-sklearn-scale-raw-b7bd1-predictor-serving-cert" not found
Apr 17 16:46:49.515714 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:46:49.515707 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls podName:99cd5cac-5bbd-4f52-ac82-88b47eeb7183 nodeName:}" failed. No retries permitted until 2026-04-17 16:46:50.015690062 +0000 UTC m=+907.538715626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls") pod "isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" (UID: "99cd5cac-5bbd-4f52-ac82-88b47eeb7183") : secret "isvc-sklearn-scale-raw-b7bd1-predictor-serving-cert" not found
Apr 17 16:46:49.515850 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.515825 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.516101 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.516084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.524378 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.524355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55rbb\" (UniqueName: \"kubernetes.io/projected/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kube-api-access-55rbb\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:49.602473 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.602451 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr"
Apr 17 16:46:49.717866 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.717773 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\") pod \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") "
Apr 17 16:46:49.717866 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.717811 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zllb6\" (UniqueName: \"kubernetes.io/projected/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-kube-api-access-zllb6\") pod \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") "
Apr 17 16:46:49.717866 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.717867 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-proxy-tls\") pod \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\" (UID: \"3abe04ba-6a27-4d5f-b85d-e7df00cffb69\") "
Apr 17 16:46:49.718233 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.718206 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-message-dumper-raw-cd35e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-cd35e-kube-rbac-proxy-sar-config") pod "3abe04ba-6a27-4d5f-b85d-e7df00cffb69" (UID: "3abe04ba-6a27-4d5f-b85d-e7df00cffb69"). InnerVolumeSpecName "message-dumper-raw-cd35e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:46:49.719959 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.719940 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-kube-api-access-zllb6" (OuterVolumeSpecName: "kube-api-access-zllb6") pod "3abe04ba-6a27-4d5f-b85d-e7df00cffb69" (UID: "3abe04ba-6a27-4d5f-b85d-e7df00cffb69"). InnerVolumeSpecName "kube-api-access-zllb6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:46:49.720083 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.720068 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3abe04ba-6a27-4d5f-b85d-e7df00cffb69" (UID: "3abe04ba-6a27-4d5f-b85d-e7df00cffb69"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:46:49.818719 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.818672 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 16:46:49.818719 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.818713 2569 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-message-dumper-raw-cd35e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 16:46:49.818719 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.818725 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zllb6\" (UniqueName: \"kubernetes.io/projected/3abe04ba-6a27-4d5f-b85d-e7df00cffb69-kube-api-access-zllb6\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 16:46:49.951779 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.951733 2569 generic.go:358] "Generic (PLEG): container finished" podID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerID="7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867" exitCode=2
Apr 17 16:46:49.951779 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.951764 2569 generic.go:358] "Generic (PLEG): container finished" podID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerID="d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d" exitCode=2
Apr 17 16:46:49.952018 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.951811 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr"
Apr 17 16:46:49.952018 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.951822 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" event={"ID":"3abe04ba-6a27-4d5f-b85d-e7df00cffb69","Type":"ContainerDied","Data":"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867"}
Apr 17 16:46:49.952018 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.951860 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" event={"ID":"3abe04ba-6a27-4d5f-b85d-e7df00cffb69","Type":"ContainerDied","Data":"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d"}
Apr 17 16:46:49.952018 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.951873 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr" event={"ID":"3abe04ba-6a27-4d5f-b85d-e7df00cffb69","Type":"ContainerDied","Data":"86bc7f7484a3560e63e14f1405ca4ac6e63983306db8e4eaee2afa3676db58ee"}
Apr 17 16:46:49.952018 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.951894 2569 scope.go:117] "RemoveContainer" containerID="7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867"
Apr 17 16:46:49.954069 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.954043 2569 generic.go:358] "Generic (PLEG): container finished" podID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerID="85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a" exitCode=2
Apr 17 16:46:49.954179 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.954113 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerDied","Data":"85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a"}
Apr 17 16:46:49.959923 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.959907 2569 scope.go:117] "RemoveContainer" containerID="d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d"
Apr 17 16:46:49.967002 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.966985 2569 scope.go:117] "RemoveContainer" containerID="7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867"
Apr 17 16:46:49.967268 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:46:49.967233 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867\": container with ID starting with 7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867 not found: ID does not exist" containerID="7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867"
Apr 17 16:46:49.967324 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.967282 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867"} err="failed to get container status \"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867\": rpc error: code = NotFound desc = could not find container \"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867\": container with ID starting with 7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867 not found: ID does not exist"
Apr 17 16:46:49.967324 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.967305 2569 scope.go:117] "RemoveContainer" containerID="d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d"
Apr 17 16:46:49.967546 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:46:49.967529 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d\": container with ID starting with d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d not found: ID does not exist" containerID="d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d"
Apr 17 16:46:49.967608 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.967554 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d"} err="failed to get container status \"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d\": rpc error: code = NotFound desc = could not find container \"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d\": container with ID starting with d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d not found: ID does not exist"
Apr 17 16:46:49.967608 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.967575 2569 scope.go:117] "RemoveContainer" containerID="7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867"
Apr 17 16:46:49.967783 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.967765 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867"} err="failed to get container status \"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867\": rpc error: code = NotFound desc = could not find container \"7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867\": container with ID starting with 7ca1c3ecf40c2b00aa22cebfcb203e8bfc1e10073d26fc709b5f3f8688525867 not found: ID does not exist"
Apr 17 16:46:49.967828 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.967785 2569 scope.go:117] "RemoveContainer" containerID="d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d"
Apr 17 16:46:49.968016 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.967973 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d"} err="failed to get container status \"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d\": rpc error: code = NotFound desc = could not find container \"d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d\": container with ID starting with d7b8d3c14f48197cf849c6fd439958c7347648431cba32c5bc76048aefac730d not found: ID does not exist"
Apr 17 16:46:49.972417 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.972394 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr"]
Apr 17 16:46:49.976772 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:49.976754 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-cd35e-predictor-b88c449b8-w9kjr"]
Apr 17 16:46:50.020162 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:50.020139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:50.022483 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:50.022460 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls\") pod \"isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:50.224351 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:50.224263 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:50.345242 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:50.345214 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"]
Apr 17 16:46:50.346979 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:46:50.346951 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99cd5cac_5bbd_4f52_ac82_88b47eeb7183.slice/crio-4c9ec654b984df3b4a6f1790a99bafbda5f03cb23717db1752f75089463a7a16 WatchSource:0}: Error finding container 4c9ec654b984df3b4a6f1790a99bafbda5f03cb23717db1752f75089463a7a16: Status 404 returned error can't find the container with id 4c9ec654b984df3b4a6f1790a99bafbda5f03cb23717db1752f75089463a7a16
Apr 17 16:46:50.958556 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:50.958515 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" event={"ID":"99cd5cac-5bbd-4f52-ac82-88b47eeb7183","Type":"ContainerStarted","Data":"944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434"}
Apr 17 16:46:50.958556 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:50.958560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" event={"ID":"99cd5cac-5bbd-4f52-ac82-88b47eeb7183","Type":"ContainerStarted","Data":"4c9ec654b984df3b4a6f1790a99bafbda5f03cb23717db1752f75089463a7a16"}
Apr 17 16:46:51.040770 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:51.040736 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" path="/var/lib/kubelet/pods/3abe04ba-6a27-4d5f-b85d-e7df00cffb69/volumes"
Apr 17 16:46:51.712242 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:51.712200 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 17 16:46:53.971977 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:53.971887 2569 generic.go:358] "Generic (PLEG): container finished" podID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerID="944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434" exitCode=0
Apr 17 16:46:53.971977 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:53.971961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" event={"ID":"99cd5cac-5bbd-4f52-ac82-88b47eeb7183","Type":"ContainerDied","Data":"944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434"}
Apr 17 16:46:53.974186 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:53.974162 2569 generic.go:358] "Generic (PLEG): container finished" podID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerID="05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a" exitCode=0
Apr 17 16:46:53.974305 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:53.974224 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerDied","Data":"05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a"}
Apr 17 16:46:54.983914 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:54.983873 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" event={"ID":"99cd5cac-5bbd-4f52-ac82-88b47eeb7183","Type":"ContainerStarted","Data":"334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f"}
Apr 17 16:46:54.983914 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:54.983918 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" event={"ID":"99cd5cac-5bbd-4f52-ac82-88b47eeb7183","Type":"ContainerStarted","Data":"5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08"}
Apr 17 16:46:54.984469 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:54.984169 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:55.002668 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:55.002619 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podStartSLOduration=6.002601316 podStartE2EDuration="6.002601316s" podCreationTimestamp="2026-04-17 16:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:46:55.002004191 +0000 UTC m=+912.525029769" watchObservedRunningTime="2026-04-17 16:46:55.002601316 +0000 UTC m=+912.525626903"
Apr 17 16:46:55.987340 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:55.987307 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:46:55.988590 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:55.988561 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 17 16:46:56.711997 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:56.711950 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 17 16:46:56.716348 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:56.716319 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 17 16:46:56.717893 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:56.717860 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:46:56.991046 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:46:56.990955 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 17 16:47:01.711590 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:01.711531 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 17 16:47:01.712087 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:01.711673 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"
Apr 17 16:47:01.995619 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:01.995539 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"
Apr 17 16:47:01.996168 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:01.996137 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 17 16:47:06.711612 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:06.711571 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 17 16:47:06.715910 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:06.715881 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 17 16:47:06.717462 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:06.717434 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:47:11.711241 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:11.711180 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 17 16:47:11.996276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:11.996144 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 17 16:47:16.711623 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:16.711581 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 17 16:47:16.716326 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:16.716296 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 17 16:47:16.716422 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:16.716410 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"
Apr 17 16:47:16.717756 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:16.717733 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 16:47:16.717848 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:16.717838 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"
Apr 17 16:47:19.403467 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.403442 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"
Apr 17 16:47:19.458941 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.458908 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8b87406-cbce-4e83-8693-c171eb8196b5-proxy-tls\") pod \"f8b87406-cbce-4e83-8693-c171eb8196b5\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") "
Apr 17 16:47:19.459116 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.458967 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8b87406-cbce-4e83-8693-c171eb8196b5-isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\") pod \"f8b87406-cbce-4e83-8693-c171eb8196b5\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") "
Apr 17 16:47:19.459116 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.458998 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b87406-cbce-4e83-8693-c171eb8196b5-kserve-provision-location\") pod \"f8b87406-cbce-4e83-8693-c171eb8196b5\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") "
Apr 17 16:47:19.459116 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.459027 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w7wg\" (UniqueName: \"kubernetes.io/projected/f8b87406-cbce-4e83-8693-c171eb8196b5-kube-api-access-2w7wg\") pod \"f8b87406-cbce-4e83-8693-c171eb8196b5\" (UID: \"f8b87406-cbce-4e83-8693-c171eb8196b5\") "
Apr 17 16:47:19.459416 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.459386 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b87406-cbce-4e83-8693-c171eb8196b5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8b87406-cbce-4e83-8693-c171eb8196b5" (UID: "f8b87406-cbce-4e83-8693-c171eb8196b5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:47:19.459416 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.459387 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b87406-cbce-4e83-8693-c171eb8196b5-isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config") pod "f8b87406-cbce-4e83-8693-c171eb8196b5" (UID: "f8b87406-cbce-4e83-8693-c171eb8196b5"). InnerVolumeSpecName "isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:47:19.461053 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.461027 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b87406-cbce-4e83-8693-c171eb8196b5-kube-api-access-2w7wg" (OuterVolumeSpecName: "kube-api-access-2w7wg") pod "f8b87406-cbce-4e83-8693-c171eb8196b5" (UID: "f8b87406-cbce-4e83-8693-c171eb8196b5"). InnerVolumeSpecName "kube-api-access-2w7wg".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:47:19.461140 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.461079 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b87406-cbce-4e83-8693-c171eb8196b5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f8b87406-cbce-4e83-8693-c171eb8196b5" (UID: "f8b87406-cbce-4e83-8693-c171eb8196b5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:47:19.559666 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.559568 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2w7wg\" (UniqueName: \"kubernetes.io/projected/f8b87406-cbce-4e83-8693-c171eb8196b5-kube-api-access-2w7wg\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:47:19.559666 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.559613 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8b87406-cbce-4e83-8693-c171eb8196b5-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:47:19.559666 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.559626 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8b87406-cbce-4e83-8693-c171eb8196b5-isvc-logger-raw-cd35e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:47:19.559666 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:19.559636 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b87406-cbce-4e83-8693-c171eb8196b5-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:47:20.064220 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.064185 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerID="0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a" exitCode=0 Apr 17 16:47:20.064416 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.064299 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" Apr 17 16:47:20.064416 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.064290 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerDied","Data":"0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a"} Apr 17 16:47:20.064416 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.064409 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9" event={"ID":"f8b87406-cbce-4e83-8693-c171eb8196b5","Type":"ContainerDied","Data":"2a7aca59f6a180a85aa50f1c40941c79f8c009dc747d492898e01a9d86664b82"} Apr 17 16:47:20.064522 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.064428 2569 scope.go:117] "RemoveContainer" containerID="0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a" Apr 17 16:47:20.073186 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.073168 2569 scope.go:117] "RemoveContainer" containerID="85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a" Apr 17 16:47:20.080460 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.080439 2569 scope.go:117] "RemoveContainer" containerID="05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a" Apr 17 16:47:20.086310 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.086284 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"] Apr 17 16:47:20.088870 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.088842 2569 scope.go:117] 
"RemoveContainer" containerID="6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481" Apr 17 16:47:20.090962 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.090940 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-cd35e-predictor-8448c47648-kjwr9"] Apr 17 16:47:20.095798 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.095781 2569 scope.go:117] "RemoveContainer" containerID="0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a" Apr 17 16:47:20.096020 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:47:20.096002 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a\": container with ID starting with 0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a not found: ID does not exist" containerID="0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a" Apr 17 16:47:20.096075 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.096031 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a"} err="failed to get container status \"0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a\": rpc error: code = NotFound desc = could not find container \"0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a\": container with ID starting with 0f762bbe04629f9076b492974c1ac867b5784b35a3a305eecb6dcf48a7a8166a not found: ID does not exist" Apr 17 16:47:20.096075 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.096050 2569 scope.go:117] "RemoveContainer" containerID="85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a" Apr 17 16:47:20.096306 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:47:20.096283 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a\": container with ID starting with 85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a not found: ID does not exist" containerID="85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a" Apr 17 16:47:20.096377 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.096317 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a"} err="failed to get container status \"85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a\": rpc error: code = NotFound desc = could not find container \"85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a\": container with ID starting with 85b6e95da055916f6716024ce3a5d61613a932c1c8cb4ee68c36008468d0544a not found: ID does not exist" Apr 17 16:47:20.096377 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.096339 2569 scope.go:117] "RemoveContainer" containerID="05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a" Apr 17 16:47:20.096595 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:47:20.096576 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a\": container with ID starting with 05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a not found: ID does not exist" containerID="05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a" Apr 17 16:47:20.096631 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.096600 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a"} err="failed to get container status \"05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a\": rpc error: code = NotFound desc = could not find container 
\"05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a\": container with ID starting with 05526200e625d55c054e8d84f01fbfb7a1d88b46d6711be7c9a0f9151399845a not found: ID does not exist" Apr 17 16:47:20.096631 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.096614 2569 scope.go:117] "RemoveContainer" containerID="6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481" Apr 17 16:47:20.096828 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:47:20.096810 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481\": container with ID starting with 6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481 not found: ID does not exist" containerID="6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481" Apr 17 16:47:20.096869 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:20.096833 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481"} err="failed to get container status \"6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481\": rpc error: code = NotFound desc = could not find container \"6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481\": container with ID starting with 6c0ff7048e003ec3a1ae8499227a214355f9dc5d88f8bb48e5702be88ca69481 not found: ID does not exist" Apr 17 16:47:21.040990 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:21.040952 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" path="/var/lib/kubelet/pods/f8b87406-cbce-4e83-8693-c171eb8196b5/volumes" Apr 17 16:47:21.996421 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:21.996383 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" 
podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:47:31.996220 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:31.996178 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:47:41.996109 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:41.996064 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:47:51.996270 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:47:51.996217 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:48:01.996836 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:48:01.996741 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:48:11.996476 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:48:11.996435 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:48:19.037269 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:48:19.037212 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:48:29.038109 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:48:29.038054 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:48:39.037448 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:48:39.037397 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:48:49.037702 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:48:49.037649 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:48:59.037130 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:48:59.037086 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.27:8080: connect: connection refused" Apr 17 16:49:09.037649 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:09.037607 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 17 16:49:19.041292 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.041240 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" Apr 17 16:49:19.487133 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.487096 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"] Apr 17 16:49:19.487479 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.487456 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" containerID="cri-o://5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08" gracePeriod=30 Apr 17 16:49:19.487550 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.487503 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kube-rbac-proxy" containerID="cri-o://334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f" gracePeriod=30 Apr 17 16:49:19.609810 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.609778 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc"] Apr 17 16:49:19.610106 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:49:19.610094 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerName="kube-rbac-proxy" Apr 17 16:49:19.610153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610107 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerName="kube-rbac-proxy" Apr 17 16:49:19.610153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610119 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" Apr 17 16:49:19.610153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610125 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" Apr 17 16:49:19.610153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610134 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" Apr 17 16:49:19.610153 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610140 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610159 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610164 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610172 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerName="kserve-container" Apr 17 16:49:19.610317 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:49:19.610180 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerName="kserve-container" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610196 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="storage-initializer" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610203 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="storage-initializer" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610267 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="agent" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610278 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerName="kube-rbac-proxy" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610284 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kserve-container" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610291 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b87406-cbce-4e83-8693-c171eb8196b5" containerName="kube-rbac-proxy" Apr 17 16:49:19.610317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.610298 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3abe04ba-6a27-4d5f-b85d-e7df00cffb69" containerName="kserve-container" Apr 17 16:49:19.613236 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.613216 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.615599 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.615581 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-c62bcc-predictor-serving-cert\"" Apr 17 16:49:19.615706 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.615609 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-c62bcc-kube-rbac-proxy-sar-config\"" Apr 17 16:49:19.622479 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.622460 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc"] Apr 17 16:49:19.669981 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.669948 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzpw7\" (UniqueName: \"kubernetes.io/projected/e8fc05a8-d184-4066-96e9-593eeddd8107-kube-api-access-bzpw7\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.670107 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.670004 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8fc05a8-d184-4066-96e9-593eeddd8107-kserve-provision-location\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.670107 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.670063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e8fc05a8-d184-4066-96e9-593eeddd8107-proxy-tls\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.670223 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.670146 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8fc05a8-d184-4066-96e9-593eeddd8107-isvc-primary-c62bcc-kube-rbac-proxy-sar-config\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.770825 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.770735 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8fc05a8-d184-4066-96e9-593eeddd8107-kserve-provision-location\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.770825 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.770792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8fc05a8-d184-4066-96e9-593eeddd8107-proxy-tls\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.771024 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.770857 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/e8fc05a8-d184-4066-96e9-593eeddd8107-isvc-primary-c62bcc-kube-rbac-proxy-sar-config\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.771024 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.770915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzpw7\" (UniqueName: \"kubernetes.io/projected/e8fc05a8-d184-4066-96e9-593eeddd8107-kube-api-access-bzpw7\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.771197 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.771173 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8fc05a8-d184-4066-96e9-593eeddd8107-kserve-provision-location\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.771570 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.771545 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8fc05a8-d184-4066-96e9-593eeddd8107-isvc-primary-c62bcc-kube-rbac-proxy-sar-config\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.773338 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.773322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8fc05a8-d184-4066-96e9-593eeddd8107-proxy-tls\") pod 
\"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.779392 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.779368 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzpw7\" (UniqueName: \"kubernetes.io/projected/e8fc05a8-d184-4066-96e9-593eeddd8107-kube-api-access-bzpw7\") pod \"isvc-primary-c62bcc-predictor-75f99866db-zhgfc\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:19.924357 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:19.924321 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:20.044077 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:20.044000 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc"] Apr 17 16:49:20.046804 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:49:20.046775 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8fc05a8_d184_4066_96e9_593eeddd8107.slice/crio-5912e8de32209fd28b9166e418d574a3dec4dbb16acca390adbee0ecb3010b7e WatchSource:0}: Error finding container 5912e8de32209fd28b9166e418d574a3dec4dbb16acca390adbee0ecb3010b7e: Status 404 returned error can't find the container with id 5912e8de32209fd28b9166e418d574a3dec4dbb16acca390adbee0ecb3010b7e Apr 17 16:49:20.421986 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:20.421944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" event={"ID":"e8fc05a8-d184-4066-96e9-593eeddd8107","Type":"ContainerStarted","Data":"ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574"} Apr 17 
16:49:20.421986 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:20.421993 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" event={"ID":"e8fc05a8-d184-4066-96e9-593eeddd8107","Type":"ContainerStarted","Data":"5912e8de32209fd28b9166e418d574a3dec4dbb16acca390adbee0ecb3010b7e"} Apr 17 16:49:20.424059 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:20.424032 2569 generic.go:358] "Generic (PLEG): container finished" podID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerID="334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f" exitCode=2 Apr 17 16:49:20.424179 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:20.424090 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" event={"ID":"99cd5cac-5bbd-4f52-ac82-88b47eeb7183","Type":"ContainerDied","Data":"334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f"} Apr 17 16:49:21.991783 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:21.991735 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.27:8643/healthz\": dial tcp 10.134.0.27:8643: connect: connection refused" Apr 17 16:49:24.437590 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:24.437554 2569 generic.go:358] "Generic (PLEG): container finished" podID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerID="ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574" exitCode=0 Apr 17 16:49:24.437962 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:24.437621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" 
event={"ID":"e8fc05a8-d184-4066-96e9-593eeddd8107","Type":"ContainerDied","Data":"ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574"} Apr 17 16:49:25.442750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:25.442716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" event={"ID":"e8fc05a8-d184-4066-96e9-593eeddd8107","Type":"ContainerStarted","Data":"2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53"} Apr 17 16:49:25.442750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:25.442767 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" event={"ID":"e8fc05a8-d184-4066-96e9-593eeddd8107","Type":"ContainerStarted","Data":"54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59"} Apr 17 16:49:25.443165 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:25.443075 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:25.462918 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:25.462871 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podStartSLOduration=6.462855864 podStartE2EDuration="6.462855864s" podCreationTimestamp="2026-04-17 16:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:49:25.461613528 +0000 UTC m=+1062.984639111" watchObservedRunningTime="2026-04-17 16:49:25.462855864 +0000 UTC m=+1062.985881442" Apr 17 16:49:26.445834 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:26.445751 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:26.447162 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:49:26.447136 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 16:49:26.991976 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:26.991933 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.27:8643/healthz\": dial tcp 10.134.0.27:8643: connect: connection refused" Apr 17 16:49:27.448515 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:27.448469 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 16:49:29.022176 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.022150 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" Apr 17 16:49:29.154368 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.154335 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55rbb\" (UniqueName: \"kubernetes.io/projected/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kube-api-access-55rbb\") pod \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " Apr 17 16:49:29.154529 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.154401 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\") pod \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " Apr 17 16:49:29.154529 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.154430 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls\") pod \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " Apr 17 16:49:29.154529 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.154459 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kserve-provision-location\") pod \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\" (UID: \"99cd5cac-5bbd-4f52-ac82-88b47eeb7183\") " Apr 17 16:49:29.154781 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.154738 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config") pod "99cd5cac-5bbd-4f52-ac82-88b47eeb7183" (UID: "99cd5cac-5bbd-4f52-ac82-88b47eeb7183"). InnerVolumeSpecName "isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:49:29.154928 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.154900 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "99cd5cac-5bbd-4f52-ac82-88b47eeb7183" (UID: "99cd5cac-5bbd-4f52-ac82-88b47eeb7183"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:29.156467 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.156446 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kube-api-access-55rbb" (OuterVolumeSpecName: "kube-api-access-55rbb") pod "99cd5cac-5bbd-4f52-ac82-88b47eeb7183" (UID: "99cd5cac-5bbd-4f52-ac82-88b47eeb7183"). InnerVolumeSpecName "kube-api-access-55rbb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:49:29.156609 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.156591 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "99cd5cac-5bbd-4f52-ac82-88b47eeb7183" (UID: "99cd5cac-5bbd-4f52-ac82-88b47eeb7183"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:49:29.255535 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.255504 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-55rbb\" (UniqueName: \"kubernetes.io/projected/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kube-api-access-55rbb\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:49:29.255535 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.255534 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-isvc-sklearn-scale-raw-b7bd1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:49:29.255741 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.255547 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:49:29.255741 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.255557 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99cd5cac-5bbd-4f52-ac82-88b47eeb7183-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:49:29.455143 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.455110 2569 generic.go:358] "Generic (PLEG): container finished" podID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerID="5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08" exitCode=0 Apr 17 16:49:29.455317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.455189 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" Apr 17 16:49:29.455317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.455193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" event={"ID":"99cd5cac-5bbd-4f52-ac82-88b47eeb7183","Type":"ContainerDied","Data":"5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08"} Apr 17 16:49:29.455317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.455237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz" event={"ID":"99cd5cac-5bbd-4f52-ac82-88b47eeb7183","Type":"ContainerDied","Data":"4c9ec654b984df3b4a6f1790a99bafbda5f03cb23717db1752f75089463a7a16"} Apr 17 16:49:29.455317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.455266 2569 scope.go:117] "RemoveContainer" containerID="334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f" Apr 17 16:49:29.463601 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.463585 2569 scope.go:117] "RemoveContainer" containerID="5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08" Apr 17 16:49:29.470566 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.470549 2569 scope.go:117] "RemoveContainer" containerID="944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434" Apr 17 16:49:29.476219 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.476197 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"] Apr 17 16:49:29.478079 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.477846 2569 scope.go:117] "RemoveContainer" containerID="334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f" Apr 17 16:49:29.478175 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:49:29.478142 2569 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f\": container with ID starting with 334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f not found: ID does not exist" containerID="334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f" Apr 17 16:49:29.478245 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.478176 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f"} err="failed to get container status \"334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f\": rpc error: code = NotFound desc = could not find container \"334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f\": container with ID starting with 334b862da05ac9ff3d6b3dec2c6494aad6eb33459a56fc5109dfb2940e05e80f not found: ID does not exist" Apr 17 16:49:29.478245 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.478198 2569 scope.go:117] "RemoveContainer" containerID="5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08" Apr 17 16:49:29.478520 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:49:29.478476 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08\": container with ID starting with 5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08 not found: ID does not exist" containerID="5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08" Apr 17 16:49:29.478520 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.478505 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08"} err="failed to get container status \"5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08\": rpc 
error: code = NotFound desc = could not find container \"5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08\": container with ID starting with 5899a8b9b868f8c68c91e566b09ce1694b8d02a3dfbfbe86ed9b5c39023d0b08 not found: ID does not exist" Apr 17 16:49:29.478699 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.478531 2569 scope.go:117] "RemoveContainer" containerID="944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434" Apr 17 16:49:29.479360 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:49:29.479333 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434\": container with ID starting with 944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434 not found: ID does not exist" containerID="944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434" Apr 17 16:49:29.479460 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.479362 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434"} err="failed to get container status \"944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434\": rpc error: code = NotFound desc = could not find container \"944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434\": container with ID starting with 944717c4a48e6706e5ef993587764d660f5cf9884a299e03eb9a0aa7dcd76434 not found: ID does not exist" Apr 17 16:49:29.480194 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:29.480170 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-b7bd1-predictor-68447964c5-ps2dz"] Apr 17 16:49:31.040898 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:31.040862 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" 
path="/var/lib/kubelet/pods/99cd5cac-5bbd-4f52-ac82-88b47eeb7183/volumes" Apr 17 16:49:32.452453 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:32.452421 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:49:32.452989 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:32.452960 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 16:49:42.453160 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:42.453122 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 16:49:52.453610 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:49:52.453568 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 16:50:02.453024 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:02.452976 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 16:50:12.453715 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:12.453676 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 16:50:22.453621 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:22.453578 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 17 16:50:32.454442 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:32.454411 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:50:39.750470 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.750423 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75"] Apr 17 16:50:39.750908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.750768 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" Apr 17 16:50:39.750908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.750789 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" Apr 17 16:50:39.750908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.750808 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kube-rbac-proxy" Apr 17 16:50:39.750908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.750815 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kube-rbac-proxy" Apr 17 16:50:39.750908 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:50:39.750826 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="storage-initializer" Apr 17 16:50:39.750908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.750835 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="storage-initializer" Apr 17 16:50:39.750908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.750891 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kube-rbac-proxy" Apr 17 16:50:39.750908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.750903 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99cd5cac-5bbd-4f52-ac82-88b47eeb7183" containerName="kserve-container" Apr 17 16:50:39.754316 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.754298 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.756963 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.756938 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-c62bcc\"" Apr 17 16:50:39.757105 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.756989 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\"" Apr 17 16:50:39.757105 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.757026 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 17 16:50:39.757235 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.757173 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-c62bcc-predictor-serving-cert\"" 
Apr 17 16:50:39.757865 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.757850 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-c62bcc-dockercfg-fkn4n\"" Apr 17 16:50:39.763077 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.763058 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75"] Apr 17 16:50:39.846624 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.846594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68b64587-ef25-439b-b5bb-c1240151639b-proxy-tls\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.846800 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.846666 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.846800 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.846695 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b64587-ef25-439b-b5bb-c1240151639b-kserve-provision-location\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.846800 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:50:39.846711 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-cabundle-cert\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.846800 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.846787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwvg\" (UniqueName: \"kubernetes.io/projected/68b64587-ef25-439b-b5bb-c1240151639b-kube-api-access-bgwvg\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.947466 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.947420 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.947466 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.947471 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b64587-ef25-439b-b5bb-c1240151639b-kserve-provision-location\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.947729 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:50:39.947561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-cabundle-cert\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.947729 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.947630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwvg\" (UniqueName: \"kubernetes.io/projected/68b64587-ef25-439b-b5bb-c1240151639b-kube-api-access-bgwvg\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.947729 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.947658 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68b64587-ef25-439b-b5bb-c1240151639b-proxy-tls\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.947903 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.947880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b64587-ef25-439b-b5bb-c1240151639b-kserve-provision-location\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.948138 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.948115 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.948222 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.948179 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-cabundle-cert\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.949983 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.949965 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68b64587-ef25-439b-b5bb-c1240151639b-proxy-tls\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:39.955925 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:39.955900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwvg\" (UniqueName: \"kubernetes.io/projected/68b64587-ef25-439b-b5bb-c1240151639b-kube-api-access-bgwvg\") pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:40.065878 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:40.065790 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:40.184656 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:40.184595 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75"] Apr 17 16:50:40.187316 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:50:40.187285 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68b64587_ef25_439b_b5bb_c1240151639b.slice/crio-880f1d282c660591b94522194489f2e6aed64cec0fc0db14c5ef32f0e7d6b818 WatchSource:0}: Error finding container 880f1d282c660591b94522194489f2e6aed64cec0fc0db14c5ef32f0e7d6b818: Status 404 returned error can't find the container with id 880f1d282c660591b94522194489f2e6aed64cec0fc0db14c5ef32f0e7d6b818 Apr 17 16:50:40.189149 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:40.189131 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:50:40.668548 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:40.668505 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" event={"ID":"68b64587-ef25-439b-b5bb-c1240151639b","Type":"ContainerStarted","Data":"4aa3772ed399de0cee418dea6be40989bc45c02993d5ce06d61956402e8d62e6"} Apr 17 16:50:40.668719 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:40.668553 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" event={"ID":"68b64587-ef25-439b-b5bb-c1240151639b","Type":"ContainerStarted","Data":"880f1d282c660591b94522194489f2e6aed64cec0fc0db14c5ef32f0e7d6b818"} Apr 17 16:50:46.687863 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:46.687784 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_68b64587-ef25-439b-b5bb-c1240151639b/storage-initializer/0.log" Apr 17 16:50:46.687863 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:46.687825 2569 generic.go:358] "Generic (PLEG): container finished" podID="68b64587-ef25-439b-b5bb-c1240151639b" containerID="4aa3772ed399de0cee418dea6be40989bc45c02993d5ce06d61956402e8d62e6" exitCode=1 Apr 17 16:50:46.688238 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:46.687886 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" event={"ID":"68b64587-ef25-439b-b5bb-c1240151639b","Type":"ContainerDied","Data":"4aa3772ed399de0cee418dea6be40989bc45c02993d5ce06d61956402e8d62e6"} Apr 17 16:50:47.692150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:47.692123 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_68b64587-ef25-439b-b5bb-c1240151639b/storage-initializer/0.log" Apr 17 16:50:47.692592 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:47.692179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" event={"ID":"68b64587-ef25-439b-b5bb-c1240151639b","Type":"ContainerStarted","Data":"8c379514086f836ef067ac5b374f04a568725fa687f50fab3ed017ee2581bdc9"} Apr 17 16:50:50.702378 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:50.702348 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_68b64587-ef25-439b-b5bb-c1240151639b/storage-initializer/1.log" Apr 17 16:50:50.702758 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:50.702706 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_68b64587-ef25-439b-b5bb-c1240151639b/storage-initializer/0.log" Apr 17 
16:50:50.702758 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:50.702739 2569 generic.go:358] "Generic (PLEG): container finished" podID="68b64587-ef25-439b-b5bb-c1240151639b" containerID="8c379514086f836ef067ac5b374f04a568725fa687f50fab3ed017ee2581bdc9" exitCode=1 Apr 17 16:50:50.702874 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:50.702766 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" event={"ID":"68b64587-ef25-439b-b5bb-c1240151639b","Type":"ContainerDied","Data":"8c379514086f836ef067ac5b374f04a568725fa687f50fab3ed017ee2581bdc9"} Apr 17 16:50:50.702874 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:50.702805 2569 scope.go:117] "RemoveContainer" containerID="4aa3772ed399de0cee418dea6be40989bc45c02993d5ce06d61956402e8d62e6" Apr 17 16:50:50.703218 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:50.703198 2569 scope.go:117] "RemoveContainer" containerID="4aa3772ed399de0cee418dea6be40989bc45c02993d5ce06d61956402e8d62e6" Apr 17 16:50:50.713102 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:50:50.713067 2569 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_kserve-ci-e2e-test_68b64587-ef25-439b-b5bb-c1240151639b_0 in pod sandbox 880f1d282c660591b94522194489f2e6aed64cec0fc0db14c5ef32f0e7d6b818 from index: no such id: '4aa3772ed399de0cee418dea6be40989bc45c02993d5ce06d61956402e8d62e6'" containerID="4aa3772ed399de0cee418dea6be40989bc45c02993d5ce06d61956402e8d62e6" Apr 17 16:50:50.713197 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:50:50.713126 2569 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_kserve-ci-e2e-test_68b64587-ef25-439b-b5bb-c1240151639b_0 in pod sandbox 880f1d282c660591b94522194489f2e6aed64cec0fc0db14c5ef32f0e7d6b818 from index: no such id: '4aa3772ed399de0cee418dea6be40989bc45c02993d5ce06d61956402e8d62e6'; Skipping pod \"isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_kserve-ci-e2e-test(68b64587-ef25-439b-b5bb-c1240151639b)\"" logger="UnhandledError" Apr 17 16:50:50.714430 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:50:50.714406 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_kserve-ci-e2e-test(68b64587-ef25-439b-b5bb-c1240151639b)\"" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" podUID="68b64587-ef25-439b-b5bb-c1240151639b" Apr 17 16:50:51.706734 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:51.706706 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_68b64587-ef25-439b-b5bb-c1240151639b/storage-initializer/1.log" Apr 17 16:50:57.803988 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.803908 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75"] Apr 17 16:50:57.855433 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.855390 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc"] Apr 17 16:50:57.856142 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.856107 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kube-rbac-proxy" 
containerID="cri-o://2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53" gracePeriod=30 Apr 17 16:50:57.856587 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.856529 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" containerID="cri-o://54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59" gracePeriod=30 Apr 17 16:50:57.941208 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.941163 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg"] Apr 17 16:50:57.946147 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.946127 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:57.949928 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.949907 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-da38c6-predictor-serving-cert\"" Apr 17 16:50:57.950092 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.950070 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\"" Apr 17 16:50:57.950166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.950146 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-da38c6\"" Apr 17 16:50:57.950166 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.950082 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-da38c6-dockercfg-g29f5\"" Apr 17 16:50:57.956787 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.956765 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg"] Apr 17 16:50:57.977937 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.977920 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_68b64587-ef25-439b-b5bb-c1240151639b/storage-initializer/1.log" Apr 17 16:50:57.978031 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.977980 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:57.992708 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.992685 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-cabundle-cert\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:57.992790 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.992723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kserve-provision-location\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:57.992790 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.992751 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: 
\"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:57.992880 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.992795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bht22\" (UniqueName: \"kubernetes.io/projected/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kube-api-access-bht22\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:57.992922 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:57.992904 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.094021 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.093920 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68b64587-ef25-439b-b5bb-c1240151639b-proxy-tls\") pod \"68b64587-ef25-439b-b5bb-c1240151639b\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " Apr 17 16:50:58.094021 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.093962 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgwvg\" (UniqueName: \"kubernetes.io/projected/68b64587-ef25-439b-b5bb-c1240151639b-kube-api-access-bgwvg\") pod \"68b64587-ef25-439b-b5bb-c1240151639b\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " Apr 17 16:50:58.094021 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.093987 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-cabundle-cert\") pod \"68b64587-ef25-439b-b5bb-c1240151639b\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " Apr 17 16:50:58.094334 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094026 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b64587-ef25-439b-b5bb-c1240151639b-kserve-provision-location\") pod \"68b64587-ef25-439b-b5bb-c1240151639b\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " Apr 17 16:50:58.094334 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094055 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\") pod \"68b64587-ef25-439b-b5bb-c1240151639b\" (UID: \"68b64587-ef25-439b-b5bb-c1240151639b\") " Apr 17 16:50:58.094334 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bht22\" (UniqueName: \"kubernetes.io/projected/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kube-api-access-bht22\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.094334 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.094544 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:50:58.094339 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-cabundle-cert\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.094544 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094348 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b64587-ef25-439b-b5bb-c1240151639b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "68b64587-ef25-439b-b5bb-c1240151639b" (UID: "68b64587-ef25-439b-b5bb-c1240151639b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:50:58.094544 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kserve-provision-location\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.094544 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094432 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.094544 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094450 
2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "68b64587-ef25-439b-b5bb-c1240151639b" (UID: "68b64587-ef25-439b-b5bb-c1240151639b"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:50:58.094807 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094561 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68b64587-ef25-439b-b5bb-c1240151639b-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:50:58.094807 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094592 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-isvc-secondary-c62bcc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-c62bcc-kube-rbac-proxy-sar-config") pod "68b64587-ef25-439b-b5bb-c1240151639b" (UID: "68b64587-ef25-439b-b5bb-c1240151639b"). InnerVolumeSpecName "isvc-secondary-c62bcc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:50:58.094807 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:50:58.094716 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-serving-cert: secret "isvc-init-fail-da38c6-predictor-serving-cert" not found Apr 17 16:50:58.094807 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:50:58.094802 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls podName:dd6a2e8a-d95d-4bba-b88a-3fd625029bb4 nodeName:}" failed. No retries permitted until 2026-04-17 16:50:58.594771989 +0000 UTC m=+1156.117797554 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls") pod "isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" (UID: "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4") : secret "isvc-init-fail-da38c6-predictor-serving-cert" not found Apr 17 16:50:58.095041 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.094891 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kserve-provision-location\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.095099 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.095061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-cabundle-cert\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.095179 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.095155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.096224 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.096195 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b64587-ef25-439b-b5bb-c1240151639b-proxy-tls" (OuterVolumeSpecName: 
"proxy-tls") pod "68b64587-ef25-439b-b5bb-c1240151639b" (UID: "68b64587-ef25-439b-b5bb-c1240151639b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:50:58.096397 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.096376 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b64587-ef25-439b-b5bb-c1240151639b-kube-api-access-bgwvg" (OuterVolumeSpecName: "kube-api-access-bgwvg") pod "68b64587-ef25-439b-b5bb-c1240151639b" (UID: "68b64587-ef25-439b-b5bb-c1240151639b"). InnerVolumeSpecName "kube-api-access-bgwvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:50:58.102986 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.102965 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bht22\" (UniqueName: \"kubernetes.io/projected/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kube-api-access-bht22\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.195780 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.195750 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68b64587-ef25-439b-b5bb-c1240151639b-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:50:58.195780 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.195776 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bgwvg\" (UniqueName: \"kubernetes.io/projected/68b64587-ef25-439b-b5bb-c1240151639b-kube-api-access-bgwvg\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:50:58.195980 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.195789 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-cabundle-cert\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:50:58.195980 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.195799 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68b64587-ef25-439b-b5bb-c1240151639b-isvc-secondary-c62bcc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:50:58.599697 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.599648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.602157 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.602136 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls\") pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.729557 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.729519 2569 generic.go:358] "Generic (PLEG): container finished" podID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerID="2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53" exitCode=2 Apr 17 16:50:58.729711 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.729593 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" 
event={"ID":"e8fc05a8-d184-4066-96e9-593eeddd8107","Type":"ContainerDied","Data":"2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53"} Apr 17 16:50:58.730760 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.730741 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75_68b64587-ef25-439b-b5bb-c1240151639b/storage-initializer/1.log" Apr 17 16:50:58.730856 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.730817 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" event={"ID":"68b64587-ef25-439b-b5bb-c1240151639b","Type":"ContainerDied","Data":"880f1d282c660591b94522194489f2e6aed64cec0fc0db14c5ef32f0e7d6b818"} Apr 17 16:50:58.730856 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.730844 2569 scope.go:117] "RemoveContainer" containerID="8c379514086f836ef067ac5b374f04a568725fa687f50fab3ed017ee2581bdc9" Apr 17 16:50:58.730947 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.730885 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75" Apr 17 16:50:58.766961 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.766934 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75"] Apr 17 16:50:58.770901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.770878 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c62bcc-predictor-6d9cfb45f7-5lf75"] Apr 17 16:50:58.857122 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.857091 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:50:58.979337 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:58.979314 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg"] Apr 17 16:50:58.981864 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:50:58.981835 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd6a2e8a_d95d_4bba_b88a_3fd625029bb4.slice/crio-135eb6cb9801f7689c9c0b86b275dfb412859ba1799d92c7223651d9075c3323 WatchSource:0}: Error finding container 135eb6cb9801f7689c9c0b86b275dfb412859ba1799d92c7223651d9075c3323: Status 404 returned error can't find the container with id 135eb6cb9801f7689c9c0b86b275dfb412859ba1799d92c7223651d9075c3323 Apr 17 16:50:59.041973 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:59.041948 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b64587-ef25-439b-b5bb-c1240151639b" path="/var/lib/kubelet/pods/68b64587-ef25-439b-b5bb-c1240151639b/volumes" Apr 17 16:50:59.735282 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:59.735231 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" event={"ID":"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4","Type":"ContainerStarted","Data":"995c1cb5a6e0aca26e5ecb655ede38001e38990ac61a250d2c8e7b0b024b4d46"} Apr 17 16:50:59.735282 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:50:59.735279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" event={"ID":"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4","Type":"ContainerStarted","Data":"135eb6cb9801f7689c9c0b86b275dfb412859ba1799d92c7223651d9075c3323"} Apr 17 16:51:02.215326 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.215298 2569 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:51:02.332417 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.332318 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8fc05a8-d184-4066-96e9-593eeddd8107-kserve-provision-location\") pod \"e8fc05a8-d184-4066-96e9-593eeddd8107\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " Apr 17 16:51:02.332417 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.332366 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzpw7\" (UniqueName: \"kubernetes.io/projected/e8fc05a8-d184-4066-96e9-593eeddd8107-kube-api-access-bzpw7\") pod \"e8fc05a8-d184-4066-96e9-593eeddd8107\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " Apr 17 16:51:02.332417 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.332418 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8fc05a8-d184-4066-96e9-593eeddd8107-proxy-tls\") pod \"e8fc05a8-d184-4066-96e9-593eeddd8107\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " Apr 17 16:51:02.332704 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.332442 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8fc05a8-d184-4066-96e9-593eeddd8107-isvc-primary-c62bcc-kube-rbac-proxy-sar-config\") pod \"e8fc05a8-d184-4066-96e9-593eeddd8107\" (UID: \"e8fc05a8-d184-4066-96e9-593eeddd8107\") " Apr 17 16:51:02.332704 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.332654 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8fc05a8-d184-4066-96e9-593eeddd8107-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"e8fc05a8-d184-4066-96e9-593eeddd8107" (UID: "e8fc05a8-d184-4066-96e9-593eeddd8107"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:51:02.332851 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.332829 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fc05a8-d184-4066-96e9-593eeddd8107-isvc-primary-c62bcc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-c62bcc-kube-rbac-proxy-sar-config") pod "e8fc05a8-d184-4066-96e9-593eeddd8107" (UID: "e8fc05a8-d184-4066-96e9-593eeddd8107"). InnerVolumeSpecName "isvc-primary-c62bcc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:51:02.334678 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.334653 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fc05a8-d184-4066-96e9-593eeddd8107-kube-api-access-bzpw7" (OuterVolumeSpecName: "kube-api-access-bzpw7") pod "e8fc05a8-d184-4066-96e9-593eeddd8107" (UID: "e8fc05a8-d184-4066-96e9-593eeddd8107"). InnerVolumeSpecName "kube-api-access-bzpw7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:51:02.334781 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.334689 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fc05a8-d184-4066-96e9-593eeddd8107-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e8fc05a8-d184-4066-96e9-593eeddd8107" (UID: "e8fc05a8-d184-4066-96e9-593eeddd8107"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:51:02.433459 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.433426 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8fc05a8-d184-4066-96e9-593eeddd8107-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:02.433459 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.433457 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-c62bcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8fc05a8-d184-4066-96e9-593eeddd8107-isvc-primary-c62bcc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:02.433459 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.433468 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8fc05a8-d184-4066-96e9-593eeddd8107-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:02.433695 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.433479 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bzpw7\" (UniqueName: \"kubernetes.io/projected/e8fc05a8-d184-4066-96e9-593eeddd8107-kube-api-access-bzpw7\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:02.746417 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.746381 2569 generic.go:358] "Generic (PLEG): container finished" podID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerID="54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59" exitCode=0 Apr 17 16:51:02.746563 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.746476 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" Apr 17 16:51:02.746563 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.746475 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" event={"ID":"e8fc05a8-d184-4066-96e9-593eeddd8107","Type":"ContainerDied","Data":"54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59"} Apr 17 16:51:02.746649 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.746591 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc" event={"ID":"e8fc05a8-d184-4066-96e9-593eeddd8107","Type":"ContainerDied","Data":"5912e8de32209fd28b9166e418d574a3dec4dbb16acca390adbee0ecb3010b7e"} Apr 17 16:51:02.746649 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.746615 2569 scope.go:117] "RemoveContainer" containerID="2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53" Apr 17 16:51:02.754908 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.754885 2569 scope.go:117] "RemoveContainer" containerID="54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59" Apr 17 16:51:02.761727 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.761711 2569 scope.go:117] "RemoveContainer" containerID="ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574" Apr 17 16:51:02.768661 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.768643 2569 scope.go:117] "RemoveContainer" containerID="2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53" Apr 17 16:51:02.768958 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:51:02.768931 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53\": container with ID starting with 2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53 not found: ID does 
not exist" containerID="2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53" Apr 17 16:51:02.769080 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.768971 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53"} err="failed to get container status \"2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53\": rpc error: code = NotFound desc = could not find container \"2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53\": container with ID starting with 2969d9d3854d7002941a4215540d03fa29a3da863d76966ab475e0d08efa5a53 not found: ID does not exist" Apr 17 16:51:02.769080 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.768997 2569 scope.go:117] "RemoveContainer" containerID="54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59" Apr 17 16:51:02.769326 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:51:02.769304 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59\": container with ID starting with 54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59 not found: ID does not exist" containerID="54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59" Apr 17 16:51:02.769456 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.769434 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59"} err="failed to get container status \"54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59\": rpc error: code = NotFound desc = could not find container \"54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59\": container with ID starting with 54d0da4d6ec8f12b1507d2e676017caeca285dc0ce286cfbead1b12200029b59 not found: ID does not exist" Apr 
17 16:51:02.769558 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.769529 2569 scope.go:117] "RemoveContainer" containerID="ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574" Apr 17 16:51:02.769860 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:51:02.769800 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574\": container with ID starting with ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574 not found: ID does not exist" containerID="ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574" Apr 17 16:51:02.769860 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.769838 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574"} err="failed to get container status \"ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574\": rpc error: code = NotFound desc = could not find container \"ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574\": container with ID starting with ebd9801242116b714737a9ffb7d4172fbb46409c37251929cf48f34d56fe4574 not found: ID does not exist" Apr 17 16:51:02.771198 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.771178 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc"] Apr 17 16:51:02.775839 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:02.775818 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c62bcc-predictor-75f99866db-zhgfc"] Apr 17 16:51:03.042380 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:03.042301 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" path="/var/lib/kubelet/pods/e8fc05a8-d184-4066-96e9-593eeddd8107/volumes" Apr 17 
16:51:03.751870 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:03.751842 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4/storage-initializer/0.log" Apr 17 16:51:03.752203 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:03.751890 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" containerID="995c1cb5a6e0aca26e5ecb655ede38001e38990ac61a250d2c8e7b0b024b4d46" exitCode=1 Apr 17 16:51:03.752203 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:03.751937 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" event={"ID":"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4","Type":"ContainerDied","Data":"995c1cb5a6e0aca26e5ecb655ede38001e38990ac61a250d2c8e7b0b024b4d46"} Apr 17 16:51:04.756172 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:04.756145 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4/storage-initializer/0.log" Apr 17 16:51:04.756560 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:04.756271 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" event={"ID":"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4","Type":"ContainerStarted","Data":"b5b37cd45148962c51c68294c0d6bcdce264c7ce92fc36653ac89fdeac4db0a0"} Apr 17 16:51:07.765242 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:07.765211 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4/storage-initializer/1.log" Apr 17 16:51:07.765615 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:07.765600 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4/storage-initializer/0.log" Apr 17 16:51:07.765659 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:07.765631 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" containerID="b5b37cd45148962c51c68294c0d6bcdce264c7ce92fc36653ac89fdeac4db0a0" exitCode=1 Apr 17 16:51:07.765726 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:07.765705 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" event={"ID":"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4","Type":"ContainerDied","Data":"b5b37cd45148962c51c68294c0d6bcdce264c7ce92fc36653ac89fdeac4db0a0"} Apr 17 16:51:07.765759 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:07.765750 2569 scope.go:117] "RemoveContainer" containerID="995c1cb5a6e0aca26e5ecb655ede38001e38990ac61a250d2c8e7b0b024b4d46" Apr 17 16:51:07.766160 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:07.766133 2569 scope.go:117] "RemoveContainer" containerID="995c1cb5a6e0aca26e5ecb655ede38001e38990ac61a250d2c8e7b0b024b4d46" Apr 17 16:51:07.776042 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:51:07.776014 2569 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_kserve-ci-e2e-test_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4_0 in pod sandbox 135eb6cb9801f7689c9c0b86b275dfb412859ba1799d92c7223651d9075c3323 from index: no such id: '995c1cb5a6e0aca26e5ecb655ede38001e38990ac61a250d2c8e7b0b024b4d46'" containerID="995c1cb5a6e0aca26e5ecb655ede38001e38990ac61a250d2c8e7b0b024b4d46" Apr 17 16:51:07.776130 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:51:07.776065 2569 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = 
Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_kserve-ci-e2e-test_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4_0 in pod sandbox 135eb6cb9801f7689c9c0b86b275dfb412859ba1799d92c7223651d9075c3323 from index: no such id: '995c1cb5a6e0aca26e5ecb655ede38001e38990ac61a250d2c8e7b0b024b4d46'; Skipping pod \"isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_kserve-ci-e2e-test(dd6a2e8a-d95d-4bba-b88a-3fd625029bb4)\"" logger="UnhandledError" Apr 17 16:51:07.777434 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:51:07.777414 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_kserve-ci-e2e-test(dd6a2e8a-d95d-4bba-b88a-3fd625029bb4)\"" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" podUID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" Apr 17 16:51:07.961146 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:07.961063 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg"] Apr 17 16:51:08.070448 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070411 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm"] Apr 17 16:51:08.070746 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070734 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68b64587-ef25-439b-b5bb-c1240151639b" containerName="storage-initializer" Apr 17 16:51:08.070798 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070747 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b64587-ef25-439b-b5bb-c1240151639b" containerName="storage-initializer" Apr 17 16:51:08.070798 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070757 2569 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" Apr 17 16:51:08.070798 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070763 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" Apr 17 16:51:08.070798 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070770 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kube-rbac-proxy" Apr 17 16:51:08.070798 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070776 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kube-rbac-proxy" Apr 17 16:51:08.070798 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070798 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="storage-initializer" Apr 17 16:51:08.070976 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070806 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="storage-initializer" Apr 17 16:51:08.070976 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070812 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68b64587-ef25-439b-b5bb-c1240151639b" containerName="storage-initializer" Apr 17 16:51:08.070976 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070818 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b64587-ef25-439b-b5bb-c1240151639b" containerName="storage-initializer" Apr 17 16:51:08.070976 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070871 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kube-rbac-proxy" Apr 17 16:51:08.070976 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:51:08.070880 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="68b64587-ef25-439b-b5bb-c1240151639b" containerName="storage-initializer" Apr 17 16:51:08.070976 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070888 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8fc05a8-d184-4066-96e9-593eeddd8107" containerName="kserve-container" Apr 17 16:51:08.070976 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.070977 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="68b64587-ef25-439b-b5bb-c1240151639b" containerName="storage-initializer" Apr 17 16:51:08.075290 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.075267 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.077712 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.077682 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-466a7-kube-rbac-proxy-sar-config\"" Apr 17 16:51:08.077849 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.077720 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7cr77\"" Apr 17 16:51:08.077849 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.077762 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-466a7-predictor-serving-cert\"" Apr 17 16:51:08.082666 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.082634 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm"] Apr 17 16:51:08.184728 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.184698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.184872 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.184741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-466a7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-raw-sklearn-466a7-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.184872 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.184791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58gld\" (UniqueName: \"kubernetes.io/projected/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kube-api-access-58gld\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.184872 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.184820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kserve-provision-location\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.285514 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.285419 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.285514 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.285474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-466a7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-raw-sklearn-466a7-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.285514 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.285502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58gld\" (UniqueName: \"kubernetes.io/projected/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kube-api-access-58gld\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.285774 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.285519 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kserve-provision-location\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.285774 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:51:08.285576 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-466a7-predictor-serving-cert: secret "raw-sklearn-466a7-predictor-serving-cert" not found Apr 17 16:51:08.285774 ip-10-0-135-127 kubenswrapper[2569]: E0417 
16:51:08.285654 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls podName:a91ab773-e143-4b6d-ab53-9b3b5f1f887a nodeName:}" failed. No retries permitted until 2026-04-17 16:51:08.78563379 +0000 UTC m=+1166.308659356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls") pod "raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" (UID: "a91ab773-e143-4b6d-ab53-9b3b5f1f887a") : secret "raw-sklearn-466a7-predictor-serving-cert" not found Apr 17 16:51:08.285997 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.285982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kserve-provision-location\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.286280 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.286237 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-466a7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-raw-sklearn-466a7-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.294624 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.294602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58gld\" (UniqueName: \"kubernetes.io/projected/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kube-api-access-58gld\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " 
pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.772632 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.772604 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4/storage-initializer/1.log" Apr 17 16:51:08.790168 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.790142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.793371 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.793344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls\") pod \"raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.902922 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.902896 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4/storage-initializer/1.log" Apr 17 16:51:08.903040 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.902963 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:51:08.986016 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.985962 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:08.991977 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.991945 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kserve-provision-location\") pod \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " Apr 17 16:51:08.992084 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.992024 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-cabundle-cert\") pod \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " Apr 17 16:51:08.992084 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.992055 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\") pod \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " Apr 17 16:51:08.992084 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.992072 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bht22\" (UniqueName: \"kubernetes.io/projected/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kube-api-access-bht22\") pod \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\" (UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " Apr 17 16:51:08.992281 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.992099 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls\") pod \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\" 
(UID: \"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4\") " Apr 17 16:51:08.992466 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.992443 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" (UID: "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:51:08.992533 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.992471 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-isvc-init-fail-da38c6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-da38c6-kube-rbac-proxy-sar-config") pod "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" (UID: "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4"). InnerVolumeSpecName "isvc-init-fail-da38c6-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:51:08.992617 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.992584 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" (UID: "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:51:08.994232 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.994208 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" (UID: "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:51:08.994333 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:08.994235 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kube-api-access-bht22" (OuterVolumeSpecName: "kube-api-access-bht22") pod "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" (UID: "dd6a2e8a-d95d-4bba-b88a-3fd625029bb4"). InnerVolumeSpecName "kube-api-access-bht22". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:51:09.092720 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.092685 2569 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-cabundle-cert\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:09.092720 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.092717 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-isvc-init-fail-da38c6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:09.092901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.092729 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bht22\" (UniqueName: \"kubernetes.io/projected/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kube-api-access-bht22\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:09.092901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.092741 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:09.092901 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.092751 2569 reconciler_common.go:299] "Volume detached for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:51:09.108980 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.108886 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm"] Apr 17 16:51:09.111132 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:51:09.111100 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda91ab773_e143_4b6d_ab53_9b3b5f1f887a.slice/crio-4abc5437c9b0b24cee5192d868167e110138852780c0c912999f6c5a8a317118 WatchSource:0}: Error finding container 4abc5437c9b0b24cee5192d868167e110138852780c0c912999f6c5a8a317118: Status 404 returned error can't find the container with id 4abc5437c9b0b24cee5192d868167e110138852780c0c912999f6c5a8a317118 Apr 17 16:51:09.776389 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.776362 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg_dd6a2e8a-d95d-4bba-b88a-3fd625029bb4/storage-initializer/1.log" Apr 17 16:51:09.776822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.776454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" event={"ID":"dd6a2e8a-d95d-4bba-b88a-3fd625029bb4","Type":"ContainerDied","Data":"135eb6cb9801f7689c9c0b86b275dfb412859ba1799d92c7223651d9075c3323"} Apr 17 16:51:09.776822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.776495 2569 scope.go:117] "RemoveContainer" containerID="b5b37cd45148962c51c68294c0d6bcdce264c7ce92fc36653ac89fdeac4db0a0" Apr 17 16:51:09.776822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.776517 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg" Apr 17 16:51:09.778047 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.777975 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" event={"ID":"a91ab773-e143-4b6d-ab53-9b3b5f1f887a","Type":"ContainerStarted","Data":"2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b"} Apr 17 16:51:09.778204 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.778180 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" event={"ID":"a91ab773-e143-4b6d-ab53-9b3b5f1f887a","Type":"ContainerStarted","Data":"4abc5437c9b0b24cee5192d868167e110138852780c0c912999f6c5a8a317118"} Apr 17 16:51:09.810803 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.810775 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg"] Apr 17 16:51:09.814882 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:09.814859 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-da38c6-predictor-6494f9f9cf-5hbkg"] Apr 17 16:51:11.040675 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:11.040640 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" path="/var/lib/kubelet/pods/dd6a2e8a-d95d-4bba-b88a-3fd625029bb4/volumes" Apr 17 16:51:12.788468 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:12.788436 2569 generic.go:358] "Generic (PLEG): container finished" podID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerID="2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b" exitCode=0 Apr 17 16:51:12.788821 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:12.788507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" 
event={"ID":"a91ab773-e143-4b6d-ab53-9b3b5f1f887a","Type":"ContainerDied","Data":"2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b"} Apr 17 16:51:13.792834 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:13.792798 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" event={"ID":"a91ab773-e143-4b6d-ab53-9b3b5f1f887a","Type":"ContainerStarted","Data":"2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83"} Apr 17 16:51:13.792834 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:13.792836 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" event={"ID":"a91ab773-e143-4b6d-ab53-9b3b5f1f887a","Type":"ContainerStarted","Data":"70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1"} Apr 17 16:51:13.793277 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:13.793129 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:13.812078 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:13.812029 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podStartSLOduration=5.812015614 podStartE2EDuration="5.812015614s" podCreationTimestamp="2026-04-17 16:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:51:13.810951803 +0000 UTC m=+1171.333977411" watchObservedRunningTime="2026-04-17 16:51:13.812015614 +0000 UTC m=+1171.335041199" Apr 17 16:51:14.796139 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:14.796104 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:14.797498 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:51:14.797473 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:51:15.801185 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:15.801147 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:51:20.805699 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:20.805671 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:51:20.806169 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:20.806141 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:51:30.806841 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:30.806798 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:51:40.806626 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:40.806581 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:51:50.806398 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:51:50.806354 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:52:00.806812 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:00.806760 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:52:10.806845 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:10.806801 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:52:20.806898 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:20.806862 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:52:28.171763 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.171679 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm"] Apr 17 16:52:28.172143 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.172061 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" 
containerID="cri-o://70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1" gracePeriod=30 Apr 17 16:52:28.172143 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.172092 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kube-rbac-proxy" containerID="cri-o://2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83" gracePeriod=30 Apr 17 16:52:28.298441 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.298411 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t"] Apr 17 16:52:28.298743 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.298731 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" containerName="storage-initializer" Apr 17 16:52:28.298803 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.298745 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" containerName="storage-initializer" Apr 17 16:52:28.298846 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.298822 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" containerName="storage-initializer" Apr 17 16:52:28.298890 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.298883 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" containerName="storage-initializer" Apr 17 16:52:28.298925 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.298891 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" containerName="storage-initializer" Apr 17 16:52:28.298966 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.298947 2569 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="dd6a2e8a-d95d-4bba-b88a-3fd625029bb4" containerName="storage-initializer" Apr 17 16:52:28.301919 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.301900 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.305325 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.305298 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-984d3-predictor-serving-cert\"" Apr 17 16:52:28.305548 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.305535 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\"" Apr 17 16:52:28.320172 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.320150 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t"] Apr 17 16:52:28.375521 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.375482 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-proxy-tls\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.375695 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.375530 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kserve-provision-location\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 
16:52:28.375695 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.375588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.375695 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.375617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnsb8\" (UniqueName: \"kubernetes.io/projected/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kube-api-access-hnsb8\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.476731 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.476645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-proxy-tls\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.476731 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.476696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kserve-provision-location\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 
16:52:28.476928 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.476737 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.476928 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.476769 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnsb8\" (UniqueName: \"kubernetes.io/projected/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kube-api-access-hnsb8\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.477118 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.477098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kserve-provision-location\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.477496 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.477476 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.479119 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.479099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-proxy-tls\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.484873 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.484851 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnsb8\" (UniqueName: \"kubernetes.io/projected/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kube-api-access-hnsb8\") pod \"raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.612107 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.612069 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:28.731652 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:28.731623 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t"] Apr 17 16:52:28.734322 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:52:28.734287 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99f446c2_5bd3_45dd_bb5f_6bc80c44195a.slice/crio-832017d748856b2edf8eacf28700665507dba8086b0cc0a691e91d245e5be2d6 WatchSource:0}: Error finding container 832017d748856b2edf8eacf28700665507dba8086b0cc0a691e91d245e5be2d6: Status 404 returned error can't find the container with id 832017d748856b2edf8eacf28700665507dba8086b0cc0a691e91d245e5be2d6 Apr 17 16:52:29.013923 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:29.013831 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" event={"ID":"99f446c2-5bd3-45dd-bb5f-6bc80c44195a","Type":"ContainerStarted","Data":"d2b8f083140369e2c6a37a725cf3030778484ee59b163db5a1684a42aa03a3ac"} Apr 17 16:52:29.013923 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:29.013871 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" event={"ID":"99f446c2-5bd3-45dd-bb5f-6bc80c44195a","Type":"ContainerStarted","Data":"832017d748856b2edf8eacf28700665507dba8086b0cc0a691e91d245e5be2d6"} Apr 17 16:52:29.015641 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:29.015618 2569 generic.go:358] "Generic (PLEG): container finished" podID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerID="2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83" exitCode=2 Apr 17 16:52:29.015755 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:29.015668 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" event={"ID":"a91ab773-e143-4b6d-ab53-9b3b5f1f887a","Type":"ContainerDied","Data":"2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83"} Apr 17 16:52:30.802173 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:30.802120 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 17 16:52:30.806497 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:30.806458 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 17 16:52:32.705458 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.705431 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:52:32.814113 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.814016 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls\") pod \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " Apr 17 16:52:32.814113 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.814106 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-466a7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-raw-sklearn-466a7-kube-rbac-proxy-sar-config\") pod \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " Apr 17 16:52:32.814408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.814142 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58gld\" (UniqueName: \"kubernetes.io/projected/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kube-api-access-58gld\") pod \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " Apr 17 16:52:32.814408 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.814168 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kserve-provision-location\") pod \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\" (UID: \"a91ab773-e143-4b6d-ab53-9b3b5f1f887a\") " Apr 17 16:52:32.814550 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.814521 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-raw-sklearn-466a7-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"raw-sklearn-466a7-kube-rbac-proxy-sar-config") pod "a91ab773-e143-4b6d-ab53-9b3b5f1f887a" (UID: "a91ab773-e143-4b6d-ab53-9b3b5f1f887a"). InnerVolumeSpecName "raw-sklearn-466a7-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:52:32.814615 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.814527 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a91ab773-e143-4b6d-ab53-9b3b5f1f887a" (UID: "a91ab773-e143-4b6d-ab53-9b3b5f1f887a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:52:32.816266 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.816218 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a91ab773-e143-4b6d-ab53-9b3b5f1f887a" (UID: "a91ab773-e143-4b6d-ab53-9b3b5f1f887a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:52:32.816375 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.816287 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kube-api-access-58gld" (OuterVolumeSpecName: "kube-api-access-58gld") pod "a91ab773-e143-4b6d-ab53-9b3b5f1f887a" (UID: "a91ab773-e143-4b6d-ab53-9b3b5f1f887a"). InnerVolumeSpecName "kube-api-access-58gld". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:52:32.915668 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.915626 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:52:32.915668 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.915661 2569 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-466a7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-raw-sklearn-466a7-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:52:32.915668 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.915672 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58gld\" (UniqueName: \"kubernetes.io/projected/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kube-api-access-58gld\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:52:32.915922 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:32.915681 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a91ab773-e143-4b6d-ab53-9b3b5f1f887a-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:52:33.030534 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.030504 2569 generic.go:358] "Generic (PLEG): container finished" podID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerID="d2b8f083140369e2c6a37a725cf3030778484ee59b163db5a1684a42aa03a3ac" exitCode=0 Apr 17 16:52:33.030705 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.030575 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" 
event={"ID":"99f446c2-5bd3-45dd-bb5f-6bc80c44195a","Type":"ContainerDied","Data":"d2b8f083140369e2c6a37a725cf3030778484ee59b163db5a1684a42aa03a3ac"} Apr 17 16:52:33.032329 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.032294 2569 generic.go:358] "Generic (PLEG): container finished" podID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerID="70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1" exitCode=0 Apr 17 16:52:33.032450 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.032333 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" event={"ID":"a91ab773-e143-4b6d-ab53-9b3b5f1f887a","Type":"ContainerDied","Data":"70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1"} Apr 17 16:52:33.032450 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.032363 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" Apr 17 16:52:33.032450 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.032373 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm" event={"ID":"a91ab773-e143-4b6d-ab53-9b3b5f1f887a","Type":"ContainerDied","Data":"4abc5437c9b0b24cee5192d868167e110138852780c0c912999f6c5a8a317118"} Apr 17 16:52:33.032450 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.032396 2569 scope.go:117] "RemoveContainer" containerID="2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83" Apr 17 16:52:33.043994 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.043975 2569 scope.go:117] "RemoveContainer" containerID="70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1" Apr 17 16:52:33.052264 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.052227 2569 scope.go:117] "RemoveContainer" containerID="2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b" Apr 17 16:52:33.061048 
ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.061029 2569 scope.go:117] "RemoveContainer" containerID="2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83" Apr 17 16:52:33.061308 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:52:33.061291 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83\": container with ID starting with 2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83 not found: ID does not exist" containerID="2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83" Apr 17 16:52:33.061362 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.061318 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83"} err="failed to get container status \"2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83\": rpc error: code = NotFound desc = could not find container \"2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83\": container with ID starting with 2e62f8e6ade98f2eb10971d5d28150b78a530c5ed066e4a3db518773c763bc83 not found: ID does not exist" Apr 17 16:52:33.061362 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.061336 2569 scope.go:117] "RemoveContainer" containerID="70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1" Apr 17 16:52:33.061558 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:52:33.061539 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1\": container with ID starting with 70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1 not found: ID does not exist" containerID="70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1" Apr 17 16:52:33.061618 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:52:33.061568 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1"} err="failed to get container status \"70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1\": rpc error: code = NotFound desc = could not find container \"70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1\": container with ID starting with 70504792f6cf31399573ba9c747418a981f2a79b3c4c8449296622c7a3b1e6c1 not found: ID does not exist" Apr 17 16:52:33.061618 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.061594 2569 scope.go:117] "RemoveContainer" containerID="2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b" Apr 17 16:52:33.061826 ip-10-0-135-127 kubenswrapper[2569]: E0417 16:52:33.061809 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b\": container with ID starting with 2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b not found: ID does not exist" containerID="2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b" Apr 17 16:52:33.061869 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.061831 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b"} err="failed to get container status \"2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b\": rpc error: code = NotFound desc = could not find container \"2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b\": container with ID starting with 2d63e4db31a46fd25e44c279bcfc0dfadf3e5a0c203c185f73b0d6dff1a22e1b not found: ID does not exist" Apr 17 16:52:33.068051 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.067894 2569 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm"] Apr 17 16:52:33.069368 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:33.069345 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-466a7-predictor-6cffc9b96c-vq2rm"] Apr 17 16:52:34.038239 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:34.038201 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" event={"ID":"99f446c2-5bd3-45dd-bb5f-6bc80c44195a","Type":"ContainerStarted","Data":"a110d8e2a3f477c25deb0f715c8fbbbb72560e860bd85d2fc379abebb88a7afe"} Apr 17 16:52:34.038677 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:34.038241 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" event={"ID":"99f446c2-5bd3-45dd-bb5f-6bc80c44195a","Type":"ContainerStarted","Data":"a24c4426a3cdd3497b0e3826c2c23cf30f7f6d9fa966e63097f5b83019cd3d01"} Apr 17 16:52:34.038677 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:34.038467 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:34.057881 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:34.057832 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podStartSLOduration=6.057818441 podStartE2EDuration="6.057818441s" podCreationTimestamp="2026-04-17 16:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:52:34.056667965 +0000 UTC m=+1251.579693550" watchObservedRunningTime="2026-04-17 16:52:34.057818441 +0000 UTC m=+1251.580844081" Apr 17 16:52:35.040678 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:35.040639 2569 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" path="/var/lib/kubelet/pods/a91ab773-e143-4b6d-ab53-9b3b5f1f887a/volumes" Apr 17 16:52:35.041154 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:35.041106 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:35.042352 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:35.042327 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:52:36.048974 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:36.048925 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:52:41.052206 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:41.052179 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:52:41.052701 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:41.052672 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:52:51.053090 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:52:51.053051 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" 
podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:53:01.053542 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:01.053497 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:53:11.053316 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:11.053269 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:53:21.053005 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:21.052966 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:53:31.053199 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:31.053161 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:53:41.053436 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:41.053404 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:53:48.379292 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:53:48.379240 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t"] Apr 17 16:53:48.379758 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:48.379551 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" containerID="cri-o://a24c4426a3cdd3497b0e3826c2c23cf30f7f6d9fa966e63097f5b83019cd3d01" gracePeriod=30 Apr 17 16:53:48.379758 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:48.379588 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kube-rbac-proxy" containerID="cri-o://a110d8e2a3f477c25deb0f715c8fbbbb72560e860bd85d2fc379abebb88a7afe" gracePeriod=30 Apr 17 16:53:49.259572 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.259537 2569 generic.go:358] "Generic (PLEG): container finished" podID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerID="a110d8e2a3f477c25deb0f715c8fbbbb72560e860bd85d2fc379abebb88a7afe" exitCode=2 Apr 17 16:53:49.259572 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.259580 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" event={"ID":"99f446c2-5bd3-45dd-bb5f-6bc80c44195a","Type":"ContainerDied","Data":"a110d8e2a3f477c25deb0f715c8fbbbb72560e860bd85d2fc379abebb88a7afe"} Apr 17 16:53:49.747733 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.747655 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ph2g8/must-gather-x54lb"] Apr 17 16:53:49.748208 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.748132 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="storage-initializer" Apr 17 16:53:49.748208 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.748150 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="storage-initializer" Apr 17 16:53:49.748208 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.748167 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" Apr 17 16:53:49.748208 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.748176 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" Apr 17 16:53:49.748208 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.748203 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kube-rbac-proxy" Apr 17 16:53:49.748521 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.748212 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kube-rbac-proxy" Apr 17 16:53:49.748521 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.748322 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kube-rbac-proxy" Apr 17 16:53:49.748521 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.748337 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a91ab773-e143-4b6d-ab53-9b3b5f1f887a" containerName="kserve-container" Apr 17 16:53:49.751519 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.751490 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:53:49.754372 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.754346 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ph2g8\"/\"default-dockercfg-92cln\"" Apr 17 16:53:49.754514 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.754400 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ph2g8\"/\"kube-root-ca.crt\"" Apr 17 16:53:49.754514 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.754405 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ph2g8\"/\"openshift-service-ca.crt\"" Apr 17 16:53:49.760639 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.760612 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ph2g8/must-gather-x54lb"] Apr 17 16:53:49.842645 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.842614 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98gvd\" (UniqueName: \"kubernetes.io/projected/f58f17c2-5caf-4540-bd65-8b76a71b4a07-kube-api-access-98gvd\") pod \"must-gather-x54lb\" (UID: \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\") " pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:53:49.842847 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.842722 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f58f17c2-5caf-4540-bd65-8b76a71b4a07-must-gather-output\") pod \"must-gather-x54lb\" (UID: \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\") " pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:53:49.943654 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.943617 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/f58f17c2-5caf-4540-bd65-8b76a71b4a07-must-gather-output\") pod \"must-gather-x54lb\" (UID: \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\") " pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:53:49.943822 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.943664 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98gvd\" (UniqueName: \"kubernetes.io/projected/f58f17c2-5caf-4540-bd65-8b76a71b4a07-kube-api-access-98gvd\") pod \"must-gather-x54lb\" (UID: \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\") " pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:53:49.943954 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.943933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f58f17c2-5caf-4540-bd65-8b76a71b4a07-must-gather-output\") pod \"must-gather-x54lb\" (UID: \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\") " pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:53:49.954998 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:49.954968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98gvd\" (UniqueName: \"kubernetes.io/projected/f58f17c2-5caf-4540-bd65-8b76a71b4a07-kube-api-access-98gvd\") pod \"must-gather-x54lb\" (UID: \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\") " pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:53:50.070241 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:50.070151 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:53:50.188396 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:50.188358 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ph2g8/must-gather-x54lb"] Apr 17 16:53:50.191713 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:53:50.191675 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf58f17c2_5caf_4540_bd65_8b76a71b4a07.slice/crio-e30700a318b28b4e1e47c10c3f3bb523d95386a91f0dac9e211b08655db44280 WatchSource:0}: Error finding container e30700a318b28b4e1e47c10c3f3bb523d95386a91f0dac9e211b08655db44280: Status 404 returned error can't find the container with id e30700a318b28b4e1e47c10c3f3bb523d95386a91f0dac9e211b08655db44280 Apr 17 16:53:50.263744 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:50.263706 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ph2g8/must-gather-x54lb" event={"ID":"f58f17c2-5caf-4540-bd65-8b76a71b4a07","Type":"ContainerStarted","Data":"e30700a318b28b4e1e47c10c3f3bb523d95386a91f0dac9e211b08655db44280"} Apr 17 16:53:51.047789 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:51.047748 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 17 16:53:51.052788 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:51.052751 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 17 16:53:54.279602 
ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.279568 2569 generic.go:358] "Generic (PLEG): container finished" podID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerID="a24c4426a3cdd3497b0e3826c2c23cf30f7f6d9fa966e63097f5b83019cd3d01" exitCode=0 Apr 17 16:53:54.280016 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.279650 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" event={"ID":"99f446c2-5bd3-45dd-bb5f-6bc80c44195a","Type":"ContainerDied","Data":"a24c4426a3cdd3497b0e3826c2c23cf30f7f6d9fa966e63097f5b83019cd3d01"} Apr 17 16:53:54.360604 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.360581 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:53:54.482322 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.482289 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kserve-provision-location\") pod \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " Apr 17 16:53:54.482520 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.482338 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-proxy-tls\") pod \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " Apr 17 16:53:54.482520 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.482374 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnsb8\" (UniqueName: \"kubernetes.io/projected/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kube-api-access-hnsb8\") pod \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " 
Apr 17 16:53:54.482520 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.482409 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\") pod \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\" (UID: \"99f446c2-5bd3-45dd-bb5f-6bc80c44195a\") " Apr 17 16:53:54.482775 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.482745 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "99f446c2-5bd3-45dd-bb5f-6bc80c44195a" (UID: "99f446c2-5bd3-45dd-bb5f-6bc80c44195a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:53:54.482892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.482837 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config") pod "99f446c2-5bd3-45dd-bb5f-6bc80c44195a" (UID: "99f446c2-5bd3-45dd-bb5f-6bc80c44195a"). InnerVolumeSpecName "raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:53:54.484850 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.484825 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kube-api-access-hnsb8" (OuterVolumeSpecName: "kube-api-access-hnsb8") pod "99f446c2-5bd3-45dd-bb5f-6bc80c44195a" (UID: "99f446c2-5bd3-45dd-bb5f-6bc80c44195a"). InnerVolumeSpecName "kube-api-access-hnsb8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:53:54.484979 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.484937 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "99f446c2-5bd3-45dd-bb5f-6bc80c44195a" (UID: "99f446c2-5bd3-45dd-bb5f-6bc80c44195a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:53:54.583176 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.583101 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnsb8\" (UniqueName: \"kubernetes.io/projected/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kube-api-access-hnsb8\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:53:54.583176 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.583133 2569 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-raw-sklearn-runtime-984d3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:53:54.583176 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.583145 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:53:54.583176 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:54.583154 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99f446c2-5bd3-45dd-bb5f-6bc80c44195a-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:53:55.284026 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.283982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-ph2g8/must-gather-x54lb" event={"ID":"f58f17c2-5caf-4540-bd65-8b76a71b4a07","Type":"ContainerStarted","Data":"9594a7c8042fa2133a1dd615eab90ffe7b4fb8261c5b870cc789043eb65fa5ae"} Apr 17 16:53:55.284499 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.284033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ph2g8/must-gather-x54lb" event={"ID":"f58f17c2-5caf-4540-bd65-8b76a71b4a07","Type":"ContainerStarted","Data":"2173e0f9a9543929cc9a8570c0cc618a25438508a9ac3707ebc65997ba257608"} Apr 17 16:53:55.286317 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.286285 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" event={"ID":"99f446c2-5bd3-45dd-bb5f-6bc80c44195a","Type":"ContainerDied","Data":"832017d748856b2edf8eacf28700665507dba8086b0cc0a691e91d245e5be2d6"} Apr 17 16:53:55.286447 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.286337 2569 scope.go:117] "RemoveContainer" containerID="a110d8e2a3f477c25deb0f715c8fbbbb72560e860bd85d2fc379abebb88a7afe" Apr 17 16:53:55.286447 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.286361 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t" Apr 17 16:53:55.294399 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.294375 2569 scope.go:117] "RemoveContainer" containerID="a24c4426a3cdd3497b0e3826c2c23cf30f7f6d9fa966e63097f5b83019cd3d01" Apr 17 16:53:55.299818 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.299769 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ph2g8/must-gather-x54lb" podStartSLOduration=1.8678046959999999 podStartE2EDuration="6.29975271s" podCreationTimestamp="2026-04-17 16:53:49 +0000 UTC" firstStartedPulling="2026-04-17 16:53:50.193382343 +0000 UTC m=+1327.716407908" lastFinishedPulling="2026-04-17 16:53:54.62533036 +0000 UTC m=+1332.148355922" observedRunningTime="2026-04-17 16:53:55.298222296 +0000 UTC m=+1332.821247892" watchObservedRunningTime="2026-04-17 16:53:55.29975271 +0000 UTC m=+1332.822778295" Apr 17 16:53:55.303215 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.303194 2569 scope.go:117] "RemoveContainer" containerID="d2b8f083140369e2c6a37a725cf3030778484ee59b163db5a1684a42aa03a3ac" Apr 17 16:53:55.311311 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.311286 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t"] Apr 17 16:53:55.314745 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:55.314723 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-984d3-predictor-6786bd7bf-r9s4t"] Apr 17 16:53:57.042295 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:53:57.042201 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" path="/var/lib/kubelet/pods/99f446c2-5bd3-45dd-bb5f-6bc80c44195a/volumes" Apr 17 16:54:13.347193 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:13.347156 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerID="2173e0f9a9543929cc9a8570c0cc618a25438508a9ac3707ebc65997ba257608" exitCode=0 Apr 17 16:54:13.347648 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:13.347240 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ph2g8/must-gather-x54lb" event={"ID":"f58f17c2-5caf-4540-bd65-8b76a71b4a07","Type":"ContainerDied","Data":"2173e0f9a9543929cc9a8570c0cc618a25438508a9ac3707ebc65997ba257608"} Apr 17 16:54:13.347648 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:13.347585 2569 scope.go:117] "RemoveContainer" containerID="2173e0f9a9543929cc9a8570c0cc618a25438508a9ac3707ebc65997ba257608" Apr 17 16:54:13.716701 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:13.716671 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ph2g8_must-gather-x54lb_f58f17c2-5caf-4540-bd65-8b76a71b4a07/gather/0.log" Apr 17 16:54:14.257236 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257203 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l7rfq/must-gather-6sxwc"] Apr 17 16:54:14.257570 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257557 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" Apr 17 16:54:14.257618 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257572 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" Apr 17 16:54:14.257618 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257591 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="storage-initializer" Apr 17 16:54:14.257618 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257599 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" 
containerName="storage-initializer" Apr 17 16:54:14.257618 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257607 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kube-rbac-proxy" Apr 17 16:54:14.257618 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257614 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kube-rbac-proxy" Apr 17 16:54:14.257776 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257667 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kube-rbac-proxy" Apr 17 16:54:14.257776 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.257679 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="99f446c2-5bd3-45dd-bb5f-6bc80c44195a" containerName="kserve-container" Apr 17 16:54:14.260831 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.260812 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l7rfq/must-gather-6sxwc" Apr 17 16:54:14.263265 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.263231 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l7rfq\"/\"kube-root-ca.crt\"" Apr 17 16:54:14.264212 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.264196 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l7rfq\"/\"openshift-service-ca.crt\"" Apr 17 16:54:14.264324 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.264228 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-l7rfq\"/\"default-dockercfg-9vr2t\"" Apr 17 16:54:14.267509 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.267486 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7rfq/must-gather-6sxwc"] Apr 17 16:54:14.357583 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.357553 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/61e118bc-10ed-4bcf-bc8d-6bfb4be7875a-must-gather-output\") pod \"must-gather-6sxwc\" (UID: \"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a\") " pod="openshift-must-gather-l7rfq/must-gather-6sxwc" Apr 17 16:54:14.357990 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.357692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62b68\" (UniqueName: \"kubernetes.io/projected/61e118bc-10ed-4bcf-bc8d-6bfb4be7875a-kube-api-access-62b68\") pod \"must-gather-6sxwc\" (UID: \"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a\") " pod="openshift-must-gather-l7rfq/must-gather-6sxwc" Apr 17 16:54:14.458367 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.458334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62b68\" (UniqueName: 
\"kubernetes.io/projected/61e118bc-10ed-4bcf-bc8d-6bfb4be7875a-kube-api-access-62b68\") pod \"must-gather-6sxwc\" (UID: \"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a\") " pod="openshift-must-gather-l7rfq/must-gather-6sxwc" Apr 17 16:54:14.458518 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.458409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/61e118bc-10ed-4bcf-bc8d-6bfb4be7875a-must-gather-output\") pod \"must-gather-6sxwc\" (UID: \"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a\") " pod="openshift-must-gather-l7rfq/must-gather-6sxwc" Apr 17 16:54:14.458775 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.458756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/61e118bc-10ed-4bcf-bc8d-6bfb4be7875a-must-gather-output\") pod \"must-gather-6sxwc\" (UID: \"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a\") " pod="openshift-must-gather-l7rfq/must-gather-6sxwc" Apr 17 16:54:14.466207 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.466171 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62b68\" (UniqueName: \"kubernetes.io/projected/61e118bc-10ed-4bcf-bc8d-6bfb4be7875a-kube-api-access-62b68\") pod \"must-gather-6sxwc\" (UID: \"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a\") " pod="openshift-must-gather-l7rfq/must-gather-6sxwc" Apr 17 16:54:14.570428 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.570341 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l7rfq/must-gather-6sxwc" Apr 17 16:54:14.686591 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:14.686561 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7rfq/must-gather-6sxwc"] Apr 17 16:54:14.690020 ip-10-0-135-127 kubenswrapper[2569]: W0417 16:54:14.689992 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61e118bc_10ed_4bcf_bc8d_6bfb4be7875a.slice/crio-42d1ac659eeb1d4118f643440b1da499ebd974c27065adf9bcc15961700980bf WatchSource:0}: Error finding container 42d1ac659eeb1d4118f643440b1da499ebd974c27065adf9bcc15961700980bf: Status 404 returned error can't find the container with id 42d1ac659eeb1d4118f643440b1da499ebd974c27065adf9bcc15961700980bf Apr 17 16:54:15.354200 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:15.354161 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/must-gather-6sxwc" event={"ID":"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a","Type":"ContainerStarted","Data":"42d1ac659eeb1d4118f643440b1da499ebd974c27065adf9bcc15961700980bf"} Apr 17 16:54:16.358839 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:16.358759 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/must-gather-6sxwc" event={"ID":"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a","Type":"ContainerStarted","Data":"deb293cae1e6d93d15da21c7290e118d68e588fc4d8b2cf7f30be17440d89354"} Apr 17 16:54:16.358839 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:16.358806 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/must-gather-6sxwc" event={"ID":"61e118bc-10ed-4bcf-bc8d-6bfb4be7875a","Type":"ContainerStarted","Data":"4f28aad3e8cfd34129d12ca2e1a9842814783261bd3d9ccdb80ec9f0bc065a71"} Apr 17 16:54:16.375719 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:16.375654 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-l7rfq/must-gather-6sxwc" podStartSLOduration=1.525880303 podStartE2EDuration="2.375636698s" podCreationTimestamp="2026-04-17 16:54:14 +0000 UTC" firstStartedPulling="2026-04-17 16:54:14.692278789 +0000 UTC m=+1352.215304354" lastFinishedPulling="2026-04-17 16:54:15.542035187 +0000 UTC m=+1353.065060749" observedRunningTime="2026-04-17 16:54:16.374636037 +0000 UTC m=+1353.897661631" watchObservedRunningTime="2026-04-17 16:54:16.375636698 +0000 UTC m=+1353.898662282" Apr 17 16:54:16.998187 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:16.998152 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nkvf5_f82da78f-1d1f-490b-b80e-531065555833/global-pull-secret-syncer/0.log" Apr 17 16:54:17.142871 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:17.142839 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jp52r_7051a978-dd0e-480e-93f4-b48b1dda0f32/konnectivity-agent/0.log" Apr 17 16:54:17.195790 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:17.195755 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-127.ec2.internal_356446819b043d77b4ba2d5504f23404/haproxy/0.log" Apr 17 16:54:19.104314 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.104273 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ph2g8/must-gather-x54lb"] Apr 17 16:54:19.104915 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.104571 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-ph2g8/must-gather-x54lb" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerName="copy" containerID="cri-o://9594a7c8042fa2133a1dd615eab90ffe7b4fb8261c5b870cc789043eb65fa5ae" gracePeriod=2 Apr 17 16:54:19.108615 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.108587 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-ph2g8/must-gather-x54lb"] Apr 17 16:54:19.109035 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.109011 2569 status_manager.go:895] "Failed to get status for pod" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" pod="openshift-must-gather-ph2g8/must-gather-x54lb" err="pods \"must-gather-x54lb\" is forbidden: User \"system:node:ip-10-0-135-127.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-ph2g8\": no relationship found between node 'ip-10-0-135-127.ec2.internal' and this object" Apr 17 16:54:19.381319 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.380449 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ph2g8_must-gather-x54lb_f58f17c2-5caf-4540-bd65-8b76a71b4a07/copy/0.log" Apr 17 16:54:19.381319 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.380832 2569 generic.go:358] "Generic (PLEG): container finished" podID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerID="9594a7c8042fa2133a1dd615eab90ffe7b4fb8261c5b870cc789043eb65fa5ae" exitCode=143 Apr 17 16:54:19.465281 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.464504 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ph2g8_must-gather-x54lb_f58f17c2-5caf-4540-bd65-8b76a71b4a07/copy/0.log" Apr 17 16:54:19.465281 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.464927 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:54:19.471525 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.470463 2569 status_manager.go:895] "Failed to get status for pod" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" pod="openshift-must-gather-ph2g8/must-gather-x54lb" err="pods \"must-gather-x54lb\" is forbidden: User \"system:node:ip-10-0-135-127.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-ph2g8\": no relationship found between node 'ip-10-0-135-127.ec2.internal' and this object" Apr 17 16:54:19.514276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.514112 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f58f17c2-5caf-4540-bd65-8b76a71b4a07-must-gather-output\") pod \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\" (UID: \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\") " Apr 17 16:54:19.514276 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.514172 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98gvd\" (UniqueName: \"kubernetes.io/projected/f58f17c2-5caf-4540-bd65-8b76a71b4a07-kube-api-access-98gvd\") pod \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\" (UID: \"f58f17c2-5caf-4540-bd65-8b76a71b4a07\") " Apr 17 16:54:19.518284 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.515993 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f58f17c2-5caf-4540-bd65-8b76a71b4a07-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f58f17c2-5caf-4540-bd65-8b76a71b4a07" (UID: "f58f17c2-5caf-4540-bd65-8b76a71b4a07"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:54:19.533055 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.530371 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58f17c2-5caf-4540-bd65-8b76a71b4a07-kube-api-access-98gvd" (OuterVolumeSpecName: "kube-api-access-98gvd") pod "f58f17c2-5caf-4540-bd65-8b76a71b4a07" (UID: "f58f17c2-5caf-4540-bd65-8b76a71b4a07"). InnerVolumeSpecName "kube-api-access-98gvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:54:19.615173 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.615130 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f58f17c2-5caf-4540-bd65-8b76a71b4a07-must-gather-output\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:54:19.615173 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:19.615172 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-98gvd\" (UniqueName: \"kubernetes.io/projected/f58f17c2-5caf-4540-bd65-8b76a71b4a07-kube-api-access-98gvd\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 16:54:20.386169 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.386074 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ph2g8_must-gather-x54lb_f58f17c2-5caf-4540-bd65-8b76a71b4a07/copy/0.log" Apr 17 16:54:20.386747 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.386583 2569 scope.go:117] "RemoveContainer" containerID="9594a7c8042fa2133a1dd615eab90ffe7b4fb8261c5b870cc789043eb65fa5ae" Apr 17 16:54:20.386747 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.386738 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ph2g8/must-gather-x54lb" Apr 17 16:54:20.389332 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.389279 2569 status_manager.go:895] "Failed to get status for pod" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" pod="openshift-must-gather-ph2g8/must-gather-x54lb" err="pods \"must-gather-x54lb\" is forbidden: User \"system:node:ip-10-0-135-127.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-ph2g8\": no relationship found between node 'ip-10-0-135-127.ec2.internal' and this object" Apr 17 16:54:20.397482 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.397457 2569 scope.go:117] "RemoveContainer" containerID="2173e0f9a9543929cc9a8570c0cc618a25438508a9ac3707ebc65997ba257608" Apr 17 16:54:20.407496 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.407456 2569 status_manager.go:895] "Failed to get status for pod" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" pod="openshift-must-gather-ph2g8/must-gather-x54lb" err="pods \"must-gather-x54lb\" is forbidden: User \"system:node:ip-10-0-135-127.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-ph2g8\": no relationship found between node 'ip-10-0-135-127.ec2.internal' and this object" Apr 17 16:54:20.914208 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.914171 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7cf048f7-68e4-4bc9-beac-730ca8f13ceb/alertmanager/0.log" Apr 17 16:54:20.938415 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.938215 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7cf048f7-68e4-4bc9-beac-730ca8f13ceb/config-reloader/0.log" Apr 17 16:54:20.968470 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.968313 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7cf048f7-68e4-4bc9-beac-730ca8f13ceb/kube-rbac-proxy-web/0.log" Apr 17 16:54:20.991705 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:20.991526 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7cf048f7-68e4-4bc9-beac-730ca8f13ceb/kube-rbac-proxy/0.log" Apr 17 16:54:21.017403 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.017181 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7cf048f7-68e4-4bc9-beac-730ca8f13ceb/kube-rbac-proxy-metric/0.log" Apr 17 16:54:21.040511 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.040478 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7cf048f7-68e4-4bc9-beac-730ca8f13ceb/prom-label-proxy/0.log" Apr 17 16:54:21.043718 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.043688 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" path="/var/lib/kubelet/pods/f58f17c2-5caf-4540-bd65-8b76a71b4a07/volumes" Apr 17 16:54:21.066538 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.066504 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7cf048f7-68e4-4bc9-beac-730ca8f13ceb/init-config-reloader/0.log" Apr 17 16:54:21.125710 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.125662 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xw5t5_0b16f4d3-198e-4627-a661-0da1d8f90ee9/kube-state-metrics/0.log" Apr 17 16:54:21.147930 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.147892 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xw5t5_0b16f4d3-198e-4627-a661-0da1d8f90ee9/kube-rbac-proxy-main/0.log" Apr 17 16:54:21.172143 ip-10-0-135-127 kubenswrapper[2569]: I0417 
16:54:21.172114 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xw5t5_0b16f4d3-198e-4627-a661-0da1d8f90ee9/kube-rbac-proxy-self/0.log" Apr 17 16:54:21.419732 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.419697 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ssfqs_1eb03b6e-f205-4b1e-976b-27236f5e9e47/node-exporter/0.log" Apr 17 16:54:21.449961 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.449930 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ssfqs_1eb03b6e-f205-4b1e-976b-27236f5e9e47/kube-rbac-proxy/0.log" Apr 17 16:54:21.474169 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.474145 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ssfqs_1eb03b6e-f205-4b1e-976b-27236f5e9e47/init-textfile/0.log" Apr 17 16:54:21.500689 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.500648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-svlf8_f4baaf28-021b-4c3a-bf16-ff044a443f99/kube-rbac-proxy-main/0.log" Apr 17 16:54:21.523242 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.523208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-svlf8_f4baaf28-021b-4c3a-bf16-ff044a443f99/kube-rbac-proxy-self/0.log" Apr 17 16:54:21.552622 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.552595 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-svlf8_f4baaf28-021b-4c3a-bf16-ff044a443f99/openshift-state-metrics/0.log" Apr 17 16:54:21.594895 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.594864 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac0c672e-04a0-4fbb-ae4d-8f52f14c74be/prometheus/0.log" 
Apr 17 16:54:21.621225 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.621197 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac0c672e-04a0-4fbb-ae4d-8f52f14c74be/config-reloader/0.log" Apr 17 16:54:21.651502 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.651474 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac0c672e-04a0-4fbb-ae4d-8f52f14c74be/thanos-sidecar/0.log" Apr 17 16:54:21.679750 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.679675 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac0c672e-04a0-4fbb-ae4d-8f52f14c74be/kube-rbac-proxy-web/0.log" Apr 17 16:54:21.704840 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.704811 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac0c672e-04a0-4fbb-ae4d-8f52f14c74be/kube-rbac-proxy/0.log" Apr 17 16:54:21.734036 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.734006 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac0c672e-04a0-4fbb-ae4d-8f52f14c74be/kube-rbac-proxy-thanos/0.log" Apr 17 16:54:21.769605 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.769579 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac0c672e-04a0-4fbb-ae4d-8f52f14c74be/init-config-reloader/0.log" Apr 17 16:54:21.812097 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.812057 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-7wqsv_538fa6d8-c9c5-4f08-b49c-55184d52040e/prometheus-operator/0.log" Apr 17 16:54:21.850112 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:21.850073 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-7wqsv_538fa6d8-c9c5-4f08-b49c-55184d52040e/kube-rbac-proxy/0.log" Apr 17 16:54:22.033623 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:22.033593 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f4f855f9-2dwwm_528a342b-53f1-4c6d-a19d-b68a3684d7d0/thanos-query/0.log" Apr 17 16:54:22.057014 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:22.056967 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f4f855f9-2dwwm_528a342b-53f1-4c6d-a19d-b68a3684d7d0/kube-rbac-proxy-web/0.log" Apr 17 16:54:22.079040 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:22.079013 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f4f855f9-2dwwm_528a342b-53f1-4c6d-a19d-b68a3684d7d0/kube-rbac-proxy/0.log" Apr 17 16:54:22.101302 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:22.101240 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f4f855f9-2dwwm_528a342b-53f1-4c6d-a19d-b68a3684d7d0/prom-label-proxy/0.log" Apr 17 16:54:22.126643 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:22.126612 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f4f855f9-2dwwm_528a342b-53f1-4c6d-a19d-b68a3684d7d0/kube-rbac-proxy-rules/0.log" Apr 17 16:54:22.158655 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:22.158616 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f4f855f9-2dwwm_528a342b-53f1-4c6d-a19d-b68a3684d7d0/kube-rbac-proxy-metrics/0.log" Apr 17 16:54:23.994633 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:23.994587 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8"] Apr 17 16:54:23.995105 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:23.995086 2569 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerName="gather" Apr 17 16:54:23.995150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:23.995109 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerName="gather" Apr 17 16:54:23.995150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:23.995126 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerName="copy" Apr 17 16:54:23.995150 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:23.995134 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerName="copy" Apr 17 16:54:23.995245 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:23.995232 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerName="copy" Apr 17 16:54:23.995296 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:23.995263 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f58f17c2-5caf-4540-bd65-8b76a71b4a07" containerName="gather" Apr 17 16:54:24.000134 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.000103 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.006758 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.006716 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8"] Apr 17 16:54:24.060997 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.060966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-lib-modules\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.061195 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.061092 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-sys\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.061195 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.061161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmszn\" (UniqueName: \"kubernetes.io/projected/824e8c97-3b8b-4347-9128-0af594b3ea99-kube-api-access-bmszn\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.061195 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.061190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-proc\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: 
\"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.061388 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.061235 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-podres\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.161762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-lib-modules\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.161762 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161758 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-sys\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.162008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161801 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmszn\" (UniqueName: \"kubernetes.io/projected/824e8c97-3b8b-4347-9128-0af594b3ea99-kube-api-access-bmszn\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.162008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161832 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"proc\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-proc\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.162008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161866 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-podres\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.162008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-lib-modules\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.162008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-sys\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.162008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161958 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-proc\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.162008 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.161978 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/824e8c97-3b8b-4347-9128-0af594b3ea99-podres\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.171095 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.171060 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmszn\" (UniqueName: \"kubernetes.io/projected/824e8c97-3b8b-4347-9128-0af594b3ea99-kube-api-access-bmszn\") pod \"perf-node-gather-daemonset-ds2k8\" (UID: \"824e8c97-3b8b-4347-9128-0af594b3ea99\") " pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.316192 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.316116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:24.457851 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:24.457827 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8"] Apr 17 16:54:25.041521 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:25.041494 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bhgll_995d1a32-09a5-4100-bd91-c6ccdf96086d/dns/0.log" Apr 17 16:54:25.061849 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:25.061824 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bhgll_995d1a32-09a5-4100-bd91-c6ccdf96086d/kube-rbac-proxy/0.log" Apr 17 16:54:25.128156 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:25.128125 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2wqxs_2be0d6b4-a7ac-45cf-80dc-e5427b8f1559/dns-node-resolver/0.log" Apr 17 16:54:25.406038 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:25.406003 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" event={"ID":"824e8c97-3b8b-4347-9128-0af594b3ea99","Type":"ContainerStarted","Data":"3c7ad9a1b5e297d7dff5ab105238f6fa06922f72be9c0cded559218a1e8af96d"} Apr 17 16:54:25.406329 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:25.406308 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" event={"ID":"824e8c97-3b8b-4347-9128-0af594b3ea99","Type":"ContainerStarted","Data":"5741d64bfe87bd2214527ca31a4a4a29f3a832edf253dd8b7b160ad54581cd28"} Apr 17 16:54:25.407409 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:25.407390 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:25.422692 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:25.422559 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" podStartSLOduration=2.422541694 podStartE2EDuration="2.422541694s" podCreationTimestamp="2026-04-17 16:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:25.421930251 +0000 UTC m=+1362.944955835" watchObservedRunningTime="2026-04-17 16:54:25.422541694 +0000 UTC m=+1362.945567278" Apr 17 16:54:25.600714 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:25.600680 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hch47_e4b20a08-0520-48d1-bce6-26fcc1371d10/node-ca/0.log" Apr 17 16:54:26.622013 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:26.621977 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4jlzg_28bd3d62-5065-4e51-a02d-686fb319fffc/serve-healthcheck-canary/0.log" Apr 17 16:54:27.147408 ip-10-0-135-127 
kubenswrapper[2569]: I0417 16:54:27.147379 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbbbm_02278729-f9e5-4615-9be2-2f650d08b858/kube-rbac-proxy/0.log" Apr 17 16:54:27.166060 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:27.166029 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbbbm_02278729-f9e5-4615-9be2-2f650d08b858/exporter/0.log" Apr 17 16:54:27.189774 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:27.189749 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbbbm_02278729-f9e5-4615-9be2-2f650d08b858/extractor/0.log" Apr 17 16:54:29.291023 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:29.290985 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-6fmnk_267806f9-e950-4f00-80b9-35aa3861db64/manager/0.log" Apr 17 16:54:29.312064 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:29.312034 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-v5zkt_7dbc4a4d-ea9d-420f-8973-64cf3b1cb9fb/s3-init/0.log" Apr 17 16:54:32.424893 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:32.424159 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-l7rfq/perf-node-gather-daemonset-ds2k8" Apr 17 16:54:34.620760 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:34.620729 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5n4q5_6a215469-2ba6-4a12-bd40-a197844067ed/kube-multus/0.log" Apr 17 16:54:34.990195 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:34.990160 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k4c9b_15557662-26a5-4d16-b9d6-e301ff3e11c6/kube-multus-additional-cni-plugins/0.log" Apr 17 16:54:35.014240 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:35.014204 
2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k4c9b_15557662-26a5-4d16-b9d6-e301ff3e11c6/egress-router-binary-copy/0.log" Apr 17 16:54:35.036892 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:35.036861 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k4c9b_15557662-26a5-4d16-b9d6-e301ff3e11c6/cni-plugins/0.log" Apr 17 16:54:35.057909 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:35.057879 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k4c9b_15557662-26a5-4d16-b9d6-e301ff3e11c6/bond-cni-plugin/0.log" Apr 17 16:54:35.080871 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:35.080848 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k4c9b_15557662-26a5-4d16-b9d6-e301ff3e11c6/routeoverride-cni/0.log" Apr 17 16:54:35.102638 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:35.102610 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k4c9b_15557662-26a5-4d16-b9d6-e301ff3e11c6/whereabouts-cni-bincopy/0.log" Apr 17 16:54:35.122348 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:35.122323 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k4c9b_15557662-26a5-4d16-b9d6-e301ff3e11c6/whereabouts-cni/0.log" Apr 17 16:54:35.215870 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:35.215789 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vtq9t_4666c56f-3d86-4e16-a782-6a41f0fe8825/network-metrics-daemon/0.log" Apr 17 16:54:35.234723 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:35.234689 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vtq9t_4666c56f-3d86-4e16-a782-6a41f0fe8825/kube-rbac-proxy/0.log" 
Apr 17 16:54:36.021638 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:36.021609 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79ft9_97b20cee-5673-4b39-a3f9-105d0d794713/ovn-controller/0.log" Apr 17 16:54:36.052700 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:36.052667 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79ft9_97b20cee-5673-4b39-a3f9-105d0d794713/ovn-acl-logging/0.log" Apr 17 16:54:36.071529 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:36.071495 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79ft9_97b20cee-5673-4b39-a3f9-105d0d794713/kube-rbac-proxy-node/0.log" Apr 17 16:54:36.096477 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:36.096449 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79ft9_97b20cee-5673-4b39-a3f9-105d0d794713/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 16:54:36.120446 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:36.120418 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79ft9_97b20cee-5673-4b39-a3f9-105d0d794713/northd/0.log" Apr 17 16:54:36.147835 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:36.147808 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79ft9_97b20cee-5673-4b39-a3f9-105d0d794713/nbdb/0.log" Apr 17 16:54:36.169784 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:36.169755 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79ft9_97b20cee-5673-4b39-a3f9-105d0d794713/sbdb/0.log" Apr 17 16:54:36.351271 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:36.351166 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79ft9_97b20cee-5673-4b39-a3f9-105d0d794713/ovnkube-controller/0.log" Apr 17 
16:54:37.856526 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:37.856497 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hdwf7_290ef757-149c-497a-85e3-cc6a8cd8fc45/network-check-target-container/0.log" Apr 17 16:54:38.749001 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:38.748968 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-c4d5f_963aea58-ae9e-49da-b049-4fd51933dfd1/iptables-alerter/0.log" Apr 17 16:54:39.391903 ip-10-0-135-127 kubenswrapper[2569]: I0417 16:54:39.391871 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7rxj5_deef8b97-d137-4d1d-b5bf-258429691ce3/tuned/0.log"