Apr 16 13:54:32.230317 ip-10-0-141-131 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:54:32.230332 ip-10-0-141-131 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:54:32.230339 ip-10-0-141-131 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:54:32.230621 ip-10-0-141-131 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:54:42.264669 ip-10-0-141-131 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:54:42.264687 ip-10-0-141-131 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2e9ceab36cfe47d4bfdca36f876c64a8 --
Apr 16 13:57:05.707029 ip-10-0-141-131 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:57:06.145365 ip-10-0-141-131 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:57:06.145365 ip-10-0-141-131 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:57:06.145365 ip-10-0-141-131 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:57:06.145365 ip-10-0-141-131 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:57:06.145365 ip-10-0-141-131 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:57:06.146685 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.146593 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
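[Editor's note] The boot-1 failure above is a unit-resource problem, not a kubelet crash: systemd could not read an EnvironmentFile= referenced by kubelet.service, the ExecStartPre= task failed for the same reason, and the restart job could not even be queued because kubelet.service references a crio.service unit that does not exist on the node (the flag dump later shows a RHEL worker, where CRI-O comes from a package). A hedged way to confirm; the exact file paths come from your unit definition, not from this log:

    # Show the unit, including its EnvironmentFile= and ExecStartPre= lines
    systemctl cat kubelet.service
    # Verify each environment file the unit references actually exists
    # (path below is illustrative; use the one printed by systemctl cat)
    ls -l /etc/systemd/system/kubelet.service.d/
    # Check whether CRI-O is installed and its unit is known to systemd
    systemctl list-unit-files 'crio*'
    rpm -q cri-o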
Apr 16 13:57:06.148984 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.148969 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:57:06.148984 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.148984 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.148987 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.148993 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.148997 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149001 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149004 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149007 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149010 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149013 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149017 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149020 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149023 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149025 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149029 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149032 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149034 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149037 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149040 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149042 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:57:06.149045 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149045 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149048 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149051 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149054 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149057 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149060 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149063 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149066 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149068 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149071 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149074 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149078 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149081 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149084 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149087 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149089 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149092 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149094 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149097 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149100 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:57:06.149498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149102 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149105 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149107 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149110 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149113 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149116 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149119 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149122 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149124 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149127 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149129 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149132 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149135 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149137 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149141 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149144 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149147 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149150 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149153 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149155 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:57:06.150005 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149158 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149162 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149164 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149167 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149169 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149172 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149175 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149177 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149180 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149182 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149185 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149188 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149190 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149193 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149195 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149198 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149200 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149203 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149205 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149208 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:57:06.150539 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149210 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149213 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149216 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149219 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149221 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149224 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149625 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149631 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149634 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149637 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149639 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149642 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149645 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149647 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149650 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149654 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149657 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149659 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149663 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:57:06.151032 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149665 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149668 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149671 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149673 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149676 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149679 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149681 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149684 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149687 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149689 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149692 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149694 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149697 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149699 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149702 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149705 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149707 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149710 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149712 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:57:06.151498 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149715 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149718 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149734 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149737 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149739 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149742 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149745 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149747 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149750 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149753 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149756 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149759 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149762 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149765 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149768 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149770 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149773 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149776 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149779 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149781 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:57:06.152033 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149785 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149789 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149791 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149794 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149797 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149799 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149802 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149806 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149809 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149813 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149815 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149819 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149822 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149825 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149828 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149831 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149833 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149836 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149839 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:57:06.152517 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149841 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149843 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149847 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149851 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149853 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149856 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149858 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149862 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149865 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149867 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149870 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149872 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149875 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149878 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.149880 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
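[Editor's note] Every "unrecognized feature gate" warning above names an OpenShift cluster-level gate (GatewayAPI, PinnedImages, ManagedBootImages, and so on) that is rendered into the kubelet's feature-gate list but means nothing to the upstream kubelet, which logs one warning per gate each time it parses the list; hence the near-identical blocks repeating within milliseconds. The messages are noisy but non-fatal. A simple grep sketch to read the rest of the journal without them:

    journalctl -b -u kubelet --no-pager | grep -v 'unrecognized feature gate'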
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150751 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150760 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150766 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150771 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150775 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150779 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:57:06.152990 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150784 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150789 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150792 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150796 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150799 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150803 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150806 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150809 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150812 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150815 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150818 2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150821 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150823 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150829 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150832 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150835 2575 flags.go:64] FLAG: --config-dir=""
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150838 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150841 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150846 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150850 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150853 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150857 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150860 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150863 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:57:06.153494 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150866 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150869 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150872 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150876 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150880 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150883 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150886 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150889 2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150892 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150901 2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150904 2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150908 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150911 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150914 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150917 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150920 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150924 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150927 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150930 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150932 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150935 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150939 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150942 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150945 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150947 2575 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:57:06.154085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150952 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150955 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150958 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150961 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150965 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150968 2575 flags.go:64] FLAG: --help="false"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150971 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-141-131.ec2.internal"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150974 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150977 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150980 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150983 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150987 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150990 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150992 2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150995 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.150998 2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151002 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151006 2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151008 2575 flags.go:64] FLAG: --kube-reserved=""
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151011 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151014 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151018 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151020 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151023 2575 flags.go:64] FLAG: --lock-file=""
Apr 16 13:57:06.154686 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151026 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151029 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151033 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151039 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151042 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151045 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151048 2575 flags.go:64] FLAG: --logging-format="text"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151051 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151054 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151057 2575 flags.go:64] FLAG: --manifest-url=""
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151060 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151065 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151068 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151073 2575 flags.go:64] FLAG: --max-pods="110"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151076 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151079 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151082 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151085 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151088 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151091 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151094 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151101 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151104 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151107 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 13:57:06.155301 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151111 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151114 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151119 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151122 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151125 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151128 2575 flags.go:64] FLAG: --port="10250"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151131 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151134 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04499ccc4eddc9e0b"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151138 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151141 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151143 2575 flags.go:64] FLAG: --register-node="true"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151147 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151150 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151154 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151156 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151159 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151162 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151166 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151169 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151172 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151175 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151178 2575 flags.go:64] FLAG: --runonce="false"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151181 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151184 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151187 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 13:57:06.155903 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151190 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151193 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151196 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151199 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151202 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151205 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151208 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151211 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151214 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151217 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151220 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151223 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151228 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151231 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151235 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151240 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151243 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151246 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151249 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151252 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151255 2575 flags.go:64] FLAG: --v="2"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151259 2575 flags.go:64] FLAG: --version="false"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151263 2575 flags.go:64] FLAG: --vmodule=""
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151268 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.151271 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
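[Editor's note] The flag dump above ends the startup banner and shows where the deprecation warnings come from: --container-runtime-endpoint, --volume-plugin-dir, --system-reserved, and --minimum-container-ttl-duration are still passed on the command line even though --config already points at /etc/kubernetes/kubelet.conf. A minimal sketch of carrying the first three in the KubeletConfiguration file instead (field names from the kubelet.config.k8s.io/v1beta1 API; on an OpenShift node this file is rendered by the Machine Config Operator, so the change belongs in a KubeletConfig custom resource rather than a hand edit):

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    systemReserved:
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi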
Apr 16 13:57:06.156573 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151375 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151379 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151382 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151385 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151388 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151392 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151397 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151400 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151403 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151406 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151408 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151411 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151414 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151417 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151420 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151424 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151428 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151431 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151434 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:57:06.157187 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151437 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151440 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151449 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151452 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151455 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151458 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151461 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151464 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151467 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151469 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151472 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151475 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151478 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151481 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151483 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151486 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151488 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151491 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151494 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151497 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:57:06.157798 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151500 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151502 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151505 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151508 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151511 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151513 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151516 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151519 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151521 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151524 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151526 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151529 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151532 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151535 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151539 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151541 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151544 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151546 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151549 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151551 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:57:06.158665 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151554 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151556 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151559 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151561 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151564 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151566 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151569 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151571 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151574 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151576 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151579 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151582 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151585 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151587 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151590 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151592 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151595 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151597 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151600 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:57:06.159543 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151602 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:57:06.160046 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151606 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:57:06.160046 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151609 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:57:06.160046 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151611 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:57:06.160046 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151614 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:57:06.160046 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151617 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:57:06.160046 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151619 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:57:06.160046 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.151624 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:57:06.160046 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.152590 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:57:06.160444 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.160422 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 13:57:06.160504 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.160448 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 13:57:06.160552 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160525 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:57:06.160552 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160535 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:57:06.160552 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160542 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:57:06.160552 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160546 2575 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:57:06.160552 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160551 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160556 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160560 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160565 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160570 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160575 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160580 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160584 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160589 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160593 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160597 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160601 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160607 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
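An annotation on the warning storm above: the gate map handed to this kubelet mixes OpenShift-side feature gates (ManagedBootImages, MachineConfigNodes, NewOLM*, and so on) with upstream Kubernetes ones, and the kubelet's own gate registry only knows the upstream names. Every foreign name is therefore warned about and skipped, while recognized gates are applied, with GA overrides (ServiceAccountTokenNodeBinding) and deprecated overrides (KMSv1) drawing an extra warning but still taking effect, as the effective map below confirms. A minimal sketch of that warn-and-continue validation, assuming a simplified three-entry registry (illustrative only, not the actual k8s.io/component-base/featuregate implementation):

package main

import "fmt"

type stage int

const (
	alpha stage = iota
	beta
	ga
	deprecated
)

type spec struct {
	def   bool
	stage stage
}

// Tiny stand-in for the kubelet's gate registry; the real set lives in
// k8s.io/component-base/featuregate plus the Kubernetes feature packages.
var known = map[string]spec{
	"ServiceAccountTokenNodeBinding": {def: true, stage: ga},
	"KMSv1":                          {def: false, stage: deprecated},
	"NodeSwap":                       {def: false, stage: beta},
}

// apply mirrors the behavior visible in the log: unknown names are skipped
// with a warning; GA and deprecated overrides warn but are still applied.
func apply(requested map[string]bool) map[string]bool {
	effective := map[string]bool{}
	for name, s := range known {
		effective[name] = s.def
	}
	for name, val := range requested {
		s, ok := known[name]
		if !ok {
			fmt.Printf("W: unrecognized feature gate: %s\n", name)
			continue
		}
		switch s.stage {
		case ga:
			fmt.Printf("W: Setting GA feature gate %s=%v. It will be removed in a future release.\n", name, val)
		case deprecated:
			fmt.Printf("W: Setting deprecated feature gate %s=%v. It will be removed in a future release.\n", name, val)
		}
		effective[name] = val
	}
	return effective
}

func main() {
	fmt.Println(apply(map[string]bool{
		"ServiceAccountTokenNodeBinding": true, // GA: warns, still applied
		"KMSv1":                          true, // deprecated: warns, still applied
		"GatewayAPI":                     true, // OpenShift-only: warned and skipped
	}))
}

The practical takeaway is that none of these warnings indicate misconfiguration on the node; they are the expected cost of passing one combined gate list to a component that only recognizes a subset of it.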
Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160615 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160620 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160625 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160629 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160633 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160637 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160641 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:57:06.160802 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160646 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160650 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160654 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160659 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160663 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160668 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160672 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160678 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160682 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160686 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160690 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160694 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160698 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160703 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160708 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:57:06.161655 ip-10-0-141-131 
kubenswrapper[2575]: W0416 13:57:06.160713 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160717 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160738 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160743 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:57:06.161655 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160747 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160751 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160756 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160760 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160764 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160768 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160772 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160777 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160780 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160785 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160789 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160793 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160797 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160801 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160807 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
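Worth noting while reading this stretch: the same warning set is emitted three times during this startup (passes beginning around 13:57:06.1514xx, .1605xx, and .1611xx), and each pass ends in an identical "feature gates: {map[...]}" summary (at 13:57:06.152590 above and at .160945 and .161494 below), so the storm is noisy but consistent. When triaging a journal like this one, collapsing the repeats makes any genuinely unique message stand out. A small hypothetical helper that counts "unrecognized feature gate" warnings per gate name from stdin (stdlib only; invocation such as "journalctl -u kubelet | go run collapse.go" is an assumption about how you capture the log):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	// Match every occurrence, not just the first: in a hard-wrapped dump
	// like this one, several journal entries can share a physical line.
	re := regexp.MustCompile(`unrecognized feature gate: (\S+)`)
	counts := map[string]int{}

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}

	names := make([]string, 0, len(counts))
	for n := range counts {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Printf("%4d %s\n", counts[n], n)
	}
}

On this excerpt the counts come out nearly uniform across gate names, which is itself the signal: the repetition is the gate map being applied multiple times, not a gate flapping between values.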
Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160814 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160819 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160824 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160829 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:57:06.162164 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160834 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160839 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160845 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160850 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160854 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160858 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160863 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160867 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160872 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160877 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160881 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160885 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160890 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160894 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160898 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160902 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160906 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160910 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 
13:57:06.160914 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160918 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:57:06.162704 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160923 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160927 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160931 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.160937 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.160945 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161109 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161116 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161121 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161126 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161130 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161135 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161139 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161144 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161148 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161152 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161158 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:57:06.163466 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161162 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161167 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: 
W0416 13:57:06.161172 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161177 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161182 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161187 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161190 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161195 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161200 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161204 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161209 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161213 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161217 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161221 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161225 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161230 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161233 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161238 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161242 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161247 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:57:06.163917 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161251 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161255 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161259 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161263 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161267 2575 feature_gate.go:328] unrecognized 
feature gate: VSphereMultiDisk Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161271 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161275 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161279 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161283 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161287 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161291 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161295 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161300 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161304 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161309 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161313 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161317 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161321 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161326 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161330 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:57:06.164428 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161334 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161339 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161343 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161347 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161353 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161359 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161364 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161368 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161373 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161377 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161382 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161386 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161390 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161394 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161398 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161402 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161406 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161410 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161414 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:57:06.164952 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161418 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161422 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161426 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161432 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161438 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161442 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161447 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161451 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161455 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161460 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161464 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161468 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161472 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161477 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161481 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:57:06.165423 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:06.161485 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:57:06.166087 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.161494 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:57:06.166087 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.162278 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 13:57:06.167159 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.167142 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 13:57:06.168259 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.168246 2575 server.go:1019] "Starting client certificate rotation" Apr 16 13:57:06.168365 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.168346 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:57:06.168407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.168382 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:57:06.192085 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.192051 2575 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:57:06.197045 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.197021 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:57:06.213560 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.213536 2575 log.go:25] "Validated CRI v1 runtime API" Apr 16 13:57:06.219938 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.219918 2575 log.go:25] "Validated CRI v1 image API" Apr 16 13:57:06.220663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.220645 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:57:06.222697 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.222681 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 13:57:06.224950 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.224930 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b005984d-587a-44c3-b2e8-bfa2d3ca8008:/dev/nvme0n1p4 c4557470-776b-4061-a7c0-2075461cbb7e:/dev/nvme0n1p3] Apr 16 13:57:06.225005 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.224951 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 13:57:06.231022 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.230911 2575 manager.go:217] Machine: {Timestamp:2026-04-16 13:57:06.229058709 +0000 UTC m=+0.407307567 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098599 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c1769165dff468da0638d592c8160 SystemUUID:ec2c1769-165d-ff46-8da0-638d592c8160 BootID:2e9ceab3-6cfe-47d4-bfdc-a36f876c64a8 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7f:22:7a:b0:85 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7f:22:7a:b0:85 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:0f:da:b5:61:f3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 
Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 13:57:06.231580 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.231569 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 13:57:06.231683 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.231671 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 13:57:06.233965 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.233934 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 13:57:06.234119 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.233967 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-131.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 13:57:06.234169 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.234129 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 13:57:06.234169 ip-10-0-141-131 
kubenswrapper[2575]: I0416 13:57:06.234138 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 13:57:06.234169 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.234151 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:57:06.235211 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.235199 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:57:06.236349 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.236337 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:57:06.236626 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.236616 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:57:06.239438 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.239425 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:57:06.239438 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.239440 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:57:06.239522 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.239458 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:57:06.239522 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.239468 2575 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:57:06.239522 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.239479 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:57:06.240523 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.240508 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:57:06.240523 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.240526 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:57:06.243801 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.243776 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:57:06.245600 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.245583 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:57:06.246869 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246856 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:57:06.246927 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246874 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:57:06.246927 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246881 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:57:06.246927 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246887 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:57:06.246927 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246896 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:57:06.246927 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246910 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:57:06.246927 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246916 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
13:57:06.246927 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246922 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:57:06.247118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246931 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:57:06.247118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246937 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:57:06.247118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246949 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:57:06.247118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.246958 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:57:06.248237 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.248222 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:57:06.248271 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.248239 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:57:06.250937 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.250916 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-131.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:57:06.251805 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.251789 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:57:06.251861 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.251827 2575 server.go:1295] "Started kubelet" Apr 16 13:57:06.251931 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.251908 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-131.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:57:06.251985 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.251910 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:57:06.251985 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.251929 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:57:06.252080 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.251955 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:57:06.252080 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.252019 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:57:06.252583 ip-10-0-141-131 systemd[1]: Started Kubernetes Kubelet. 
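Two things are worth annotating at this point in the log. First, the system:anonymous "forbidden" errors are the TLS bootstrap window, not an RBAC problem: the kubelet is still on bootstrap credentials here, and these errors stop once the client certificate (csr-n77xw, a few entries below) is approved and issued. Second, the Container Manager nodeConfig logged above pins down this node's allocatable math. A worked sketch with the values from this log, assuming the standard kubelet Node Allocatable formula (capacity minus kube-reserved minus system-reserved minus the hard eviction threshold; KubeReserved is null in this config):

package main

import "fmt"

const (
	Mi = int64(1) << 20
	Gi = int64(1) << 30
)

func main() {
	// Values taken from the Machine and nodeConfig entries logged above.
	capacity := int64(33164492800) // MemoryCapacity in bytes
	systemReserved := 1 * Gi       // SystemReserved memory: "1Gi"
	kubeReserved := int64(0)       // KubeReserved is null in this config
	hardEviction := 100 * Mi       // memory.available hard threshold: "100Mi"

	allocatable := capacity - systemReserved - kubeReserved - hardEviction
	fmt.Printf("capacity:    %d bytes (~%.1f GiB)\n", capacity, float64(capacity)/float64(Gi))
	fmt.Printf("allocatable: %d bytes (~%.1f GiB)\n", allocatable, float64(allocatable)/float64(Gi))
	// capacity:    33164492800 bytes (~30.9 GiB)
	// allocatable: 31985893376 bytes (~29.8 GiB)
}

So roughly 1.1 GiB of the node's memory is held back for system daemons and the eviction buffer before pods are scheduled; the percentage-based nodefs/imagefs thresholds in the same config affect eviction decisions but not this memory figure.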
Apr 16 13:57:06.253118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.252993 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:57:06.253652 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.253641 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:57:06.257699 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.257678 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:57:06.258117 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.258099 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:57:06.259186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.258647 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:57:06.259186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.258674 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:57:06.259186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.258689 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:57:06.259186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.258842 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:57:06.259186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.258849 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:57:06.259186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.258877 2575 factory.go:55] Registering systemd factory Apr 16 13:57:06.259186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.258894 2575 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:57:06.259186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.259188 2575 factory.go:153] Registering CRI-O factory Apr 16 13:57:06.259575 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.259200 2575 factory.go:223] Registration of the crio container factory successfully Apr 16 13:57:06.259575 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.259260 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:57:06.259575 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.259265 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:06.259575 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.259282 2575 factory.go:103] Registering Raw factory Apr 16 13:57:06.259575 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.259299 2575 manager.go:1196] Started watching for new ooms in manager Apr 16 13:57:06.259847 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.259713 2575 manager.go:319] Starting recovery of all containers Apr 16 13:57:06.264305 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.264278 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n77xw" Apr 16 13:57:06.264588 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.264560 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 13:57:06.264653 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.264581 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-131.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 13:57:06.265561 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.264669 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-131.ec2.internal.18a6daefbd46ac19 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-131.ec2.internal,UID:ip-10-0-141-131.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-131.ec2.internal,},FirstTimestamp:2026-04-16 13:57:06.251803673 +0000 UTC m=+0.430052532,LastTimestamp:2026-04-16 13:57:06.251803673 +0000 UTC m=+0.430052532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-131.ec2.internal,}" Apr 16 13:57:06.271934 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.271776 2575 manager.go:324] Recovery completed Apr 16 13:57:06.272309 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.272281 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n77xw" Apr 16 13:57:06.278196 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.278181 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:57:06.280692 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.280675 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:57:06.280780 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.280704 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:57:06.280780 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.280715 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:57:06.281246 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.281232 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:57:06.281246 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.281247 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:57:06.281364 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.281288 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:57:06.282956 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.282894 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-131.ec2.internal.18a6daefbeff7734 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-131.ec2.internal,UID:ip-10-0-141-131.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-131.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-131.ec2.internal,},FirstTimestamp:2026-04-16 13:57:06.280691508 +0000 UTC m=+0.458940367,LastTimestamp:2026-04-16 13:57:06.280691508 +0000 UTC m=+0.458940367,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-131.ec2.internal,}" Apr 16 13:57:06.283131 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.283119 2575 policy_none.go:49] "None policy: Start" Apr 16 13:57:06.283186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.283138 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:57:06.283186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.283152 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:57:06.320344 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.320318 2575 manager.go:341] "Starting Device Plugin manager" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.320363 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.320378 2575 server.go:85] "Starting device plugin registration server" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.320629 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.320641 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.320754 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.320832 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.320840 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.321264 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:57:06.343369 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.321300 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:06.384481 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.384441 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:57:06.385602 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.385586 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 13:57:06.385669 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.385621 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:57:06.385669 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.385642 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 13:57:06.385669 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.385649 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:57:06.385824 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.385682 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:57:06.388912 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.388892 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:57:06.420836 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.420773 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:57:06.421918 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.421901 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:57:06.422003 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.421932 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:57:06.422003 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.421942 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:57:06.422003 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.421964 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.428248 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.428229 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.428358 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.428255 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-131.ec2.internal\": node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:06.448864 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.448838 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:06.486760 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.486698 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal"] Apr 16 13:57:06.486923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.486831 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:57:06.487826 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.487805 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:57:06.487946 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.487840 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:57:06.487946 ip-10-0-141-131 kubenswrapper[2575]: 
I0416 13:57:06.487853 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:57:06.489142 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489126 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:57:06.489263 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489252 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.489308 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489278 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:57:06.489829 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489814 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:57:06.489829 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489816 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:57:06.489953 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489844 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:57:06.489953 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489850 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:57:06.489953 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489858 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:57:06.489953 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.489860 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:57:06.491589 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.491571 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.491674 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.491603 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:57:06.497893 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.497873 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:57:06.498006 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.497905 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:57:06.498006 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.497919 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:57:06.505438 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.505416 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-131.ec2.internal\" not found" node="ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.511632 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.511611 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-131.ec2.internal\" not found" node="ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.549603 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.549568 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:06.560474 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.560448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d8d2889d114a061679832b8c70f242a6-config\") pod \"kube-apiserver-proxy-ip-10-0-141-131.ec2.internal\" (UID: \"d8d2889d114a061679832b8c70f242a6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.560474 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.560476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8bcc61cd4afd68fd131f750c02f2018e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal\" (UID: \"8bcc61cd4afd68fd131f750c02f2018e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.560663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.560497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bcc61cd4afd68fd131f750c02f2018e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal\" (UID: \"8bcc61cd4afd68fd131f750c02f2018e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.650312 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.650274 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:06.660649 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.660626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/d8d2889d114a061679832b8c70f242a6-config\") pod \"kube-apiserver-proxy-ip-10-0-141-131.ec2.internal\" (UID: \"d8d2889d114a061679832b8c70f242a6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.660801 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.660661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8bcc61cd4afd68fd131f750c02f2018e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal\" (UID: \"8bcc61cd4afd68fd131f750c02f2018e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.660801 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.660696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bcc61cd4afd68fd131f750c02f2018e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal\" (UID: \"8bcc61cd4afd68fd131f750c02f2018e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.660801 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.660760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8bcc61cd4afd68fd131f750c02f2018e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal\" (UID: \"8bcc61cd4afd68fd131f750c02f2018e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.660801 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.660760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8bcc61cd4afd68fd131f750c02f2018e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal\" (UID: \"8bcc61cd4afd68fd131f750c02f2018e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.660942 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.660760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d8d2889d114a061679832b8c70f242a6-config\") pod \"kube-apiserver-proxy-ip-10-0-141-131.ec2.internal\" (UID: \"d8d2889d114a061679832b8c70f242a6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.751088 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.751010 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:06.808476 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.808446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.814962 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:06.814939 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:06.851362 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.851335 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:06.951976 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:06.951930 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:07.052442 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.052368 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:07.152594 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.152570 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-131.ec2.internal\" not found" Apr 16 13:57:07.155814 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.155797 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:57:07.168208 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.168193 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 13:57:07.168607 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.168590 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:57:07.168653 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.168595 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:57:07.194971 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.194949 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:57:07.239814 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.239777 2575 apiserver.go:52] "Watching apiserver" Apr 16 13:57:07.246529 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.246506 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:57:07.249227 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.248555 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hvqjp","openshift-image-registry/node-ca-89jl9","openshift-network-operator/iptables-alerter-5kvpq","openshift-multus/multus-additional-cni-plugins-xjl2f","openshift-multus/multus-mmczn","openshift-multus/network-metrics-daemon-crlsp","openshift-network-diagnostics/network-check-target-96bs9","openshift-ovn-kubernetes/ovnkube-node-qv8tl","kube-system/konnectivity-agent-cr2k7","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg","openshift-cluster-node-tuning-operator/tuned-xfspm"] Apr 16 13:57:07.251922 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.251906 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.253122 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.253101 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.253219 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.253178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.254191 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.254173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:57:07.254285 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.254176 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:57:07.254285 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.254217 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9c8gz\"" Apr 16 13:57:07.255218 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.255191 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:57:07.255324 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.255263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.255767 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.255749 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:57:07.255860 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.255846 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:57:07.256229 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.256217 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.256328 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.256316 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:07.256405 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.256385 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:07.256823 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.256769 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7vmbj\"" Apr 16 13:57:07.256823 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.256789 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:57:07.256823 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.256808 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:57:07.256979 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.256873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5s4x2\"" Apr 16 13:57:07.257038 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.257026 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:57:07.257336 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.257320 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:07.257400 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.257381 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:07.257664 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.257649 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:57:07.257798 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.257786 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 13:57:07.257847 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.257817 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:57:07.258006 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.257992 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:57:07.258073 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.258009 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:57:07.258073 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.258014 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:57:07.258150 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.258013 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fjrgw\"" Apr 16 13:57:07.258354 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.258337 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:57:07.258414 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.258393 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fb74c\"" Apr 16 13:57:07.258486 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.258470 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" Apr 16 13:57:07.258602 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.258547 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.259538 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.259523 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:07.260547 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.260532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:57:07.260649 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.260533 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:57:07.260649 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.260534 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-k9nz4\"" Apr 16 13:57:07.260826 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.260812 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.260892 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.260863 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:57:07.260961 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.260945 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:57:07.261015 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.260954 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:57:07.261062 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.261016 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:57:07.261831 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.261815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.262167 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.262153 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dbr77\"" Apr 16 13:57:07.262244 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.262228 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:57:07.262369 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.262357 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:57:07.262746 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.262719 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jf46\"" Apr 16 13:57:07.262867 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.262855 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:57:07.263007 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.262996 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:57:07.263007 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263001 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:57:07.263184 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.263246 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9xzl\" (UniqueName: \"kubernetes.io/projected/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-kube-api-access-m9xzl\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: 
\"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.263246 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-socket-dir-parent\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.263324 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-hostroot\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.263324 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-node-log\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.263380 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc2nn\" (UniqueName: \"kubernetes.io/projected/cb697396-e88e-4780-9f6a-2109bfc21e0f-kube-api-access-qc2nn\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.263380 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-k8s-cni-cncf-io\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.263438 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-multus-certs\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.263438 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvt8\" (UniqueName: \"kubernetes.io/projected/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-kube-api-access-cbvt8\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.263438 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-run-netns\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.263525 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263453 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:07.263525 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-os-release\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.263525 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-system-cni-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.263525 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa435174-0b32-4191-b42c-ad32bd3bc5db-konnectivity-ca\") pod \"konnectivity-agent-cr2k7\" (UID: \"fa435174-0b32-4191-b42c-ad32bd3bc5db\") " pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-var-lib-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovnkube-config\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263563 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-cni-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-cnibin\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-systemd-units\") pod \"ovnkube-node-qv8tl\" (UID: 
\"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-systemd\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263649 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa435174-0b32-4191-b42c-ad32bd3bc5db-agent-certs\") pod \"konnectivity-agent-cr2k7\" (UID: \"fa435174-0b32-4191-b42c-ad32bd3bc5db\") " pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:07.263663 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263661 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-log-socket\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b680da6-ab85-4c31-98d8-35be4b07624b-tmp-dir\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m4z9\" (UniqueName: \"kubernetes.io/projected/77d59171-3e29-4e55-a4d9-a076a67a50ce-kube-api-access-5m4z9\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-etc-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-env-overrides\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-system-cni-dir\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263851 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263851 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-rjz29\"" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-kubelet\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovn-node-metrics-cert\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cnibin\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.263987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-cni-binary-copy\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:07.264071 
ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84xv2\" (UniqueName: \"kubernetes.io/projected/04113a02-0dc7-42c8-a11b-4684fb794c4f-kube-api-access-84xv2\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264071 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-slash\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-ovn\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b680da6-ab85-4c31-98d8-35be4b07624b-hosts-file\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264141 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s4lp\" (UniqueName: \"kubernetes.io/projected/1b680da6-ab85-4c31-98d8-35be4b07624b-kube-api-access-5s4lp\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-cni-multus\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-cni-netd\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cb697396-e88e-4780-9f6a-2109bfc21e0f-serviceca\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a0fcf8a-7d68-4b75-b145-75ba1622662d-iptables-alerter-script\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a0fcf8a-7d68-4b75-b145-75ba1622662d-host-slash\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-os-release\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-daemon-config\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-netns\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-cni-bin\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-etc-kubernetes\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264422 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.264593 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb697396-e88e-4780-9f6a-2109bfc21e0f-host\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.265118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgp8f\" (UniqueName: \"kubernetes.io/projected/0a0fcf8a-7d68-4b75-b145-75ba1622662d-kube-api-access-tgp8f\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.265118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.265118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-kubelet\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.265118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-conf-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.265118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.265118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-cni-bin\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.265118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.264619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovnkube-script-lib\") pod \"ovnkube-node-qv8tl\" (UID: 
\"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.269087 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.269070 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:57:07.274242 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.273876 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:52:06 +0000 UTC" deadline="2027-12-13 21:14:01.239845336 +0000 UTC" Apr 16 13:57:07.274242 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.273925 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14551h16m53.965924396s" Apr 16 13:57:07.274496 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.274479 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal"] Apr 16 13:57:07.274567 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.274507 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:57:07.274929 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.274915 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" Apr 16 13:57:07.282696 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.282677 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal"] Apr 16 13:57:07.282911 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.282897 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:57:07.295102 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.295082 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dkrf6" Apr 16 13:57:07.303677 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.303627 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dkrf6" Apr 16 13:57:07.359197 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.359172 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:57:07.365332 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-ovn\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.365463 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b680da6-ab85-4c31-98d8-35be4b07624b-hosts-file\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.365463 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365368 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5s4lp\" (UniqueName: \"kubernetes.io/projected/1b680da6-ab85-4c31-98d8-35be4b07624b-kube-api-access-5s4lp\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.365463 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.365463 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-cni-multus\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.365463 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-cni-netd\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.365463 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cb697396-e88e-4780-9f6a-2109bfc21e0f-serviceca\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365488 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b680da6-ab85-4c31-98d8-35be4b07624b-hosts-file\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-cni-multus\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a0fcf8a-7d68-4b75-b145-75ba1622662d-iptables-alerter-script\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-ovn\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.365777 ip-10-0-141-131 
kubenswrapper[2575]: I0416 13:57:07.365531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-cni-netd\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a0fcf8a-7d68-4b75-b145-75ba1622662d-host-slash\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-os-release\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-daemon-config\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-netns\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-cni-bin\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-etc-kubernetes\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87cef1d3-c711-4f53-a775-8b55ec1bcf86-tmp\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-netns\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.365777 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365779 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-os-release\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb697396-e88e-4780-9f6a-2109bfc21e0f-host\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a0fcf8a-7d68-4b75-b145-75ba1622662d-host-slash\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgp8f\" (UniqueName: \"kubernetes.io/projected/0a0fcf8a-7d68-4b75-b145-75ba1622662d-kube-api-access-tgp8f\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb697396-e88e-4780-9f6a-2109bfc21e0f-host\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-cni-bin\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-kubelet\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366142 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a0fcf8a-7d68-4b75-b145-75ba1622662d-iptables-alerter-script\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-daemon-config\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.365906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-var-lib-kubelet\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-etc-kubernetes\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-conf-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb744\" (UniqueName: \"kubernetes.io/projected/7156eacb-925d-44f8-8084-e8c54e35371b-kube-api-access-wb744\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.366407 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-conf-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " 
pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-cni-bin\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovnkube-script-lib\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9xzl\" (UniqueName: \"kubernetes.io/projected/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-kube-api-access-m9xzl\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366411 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-cni-bin\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-socket-dir-parent\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-hostroot\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-systemd\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-hostroot\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-node-log\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cb697396-e88e-4780-9f6a-2109bfc21e0f-serviceca\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc2nn\" (UniqueName: \"kubernetes.io/projected/cb697396-e88e-4780-9f6a-2109bfc21e0f-kube-api-access-qc2nn\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-k8s-cni-cncf-io\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-k8s-cni-cncf-io\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.367289 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-socket-dir-parent\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-node-log\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.366993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-multus-certs\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovnkube-script-lib\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbvt8\" (UniqueName: \"kubernetes.io/projected/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-kube-api-access-cbvt8\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-host-run-multus-certs\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysctl-d\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367206 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367222 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-run\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-run-netns\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-run-netns\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-os-release\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-system-cni-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-socket-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-os-release\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.368088 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.367411 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367420 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa435174-0b32-4191-b42c-ad32bd3bc5db-konnectivity-ca\") pod \"konnectivity-agent-cr2k7\" (UID: \"fa435174-0b32-4191-b42c-ad32bd3bc5db\") " pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-system-cni-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-var-lib-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.367484 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 13:57:07.867456327 +0000 UTC m=+2.045705177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-var-lib-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovnkube-config\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367555 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-cni-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-cnibin\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysconfig\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-multus-cni-dir\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-host\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-cnibin\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-tuned\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-systemd-units\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-systemd\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.368906 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa435174-0b32-4191-b42c-ad32bd3bc5db-agent-certs\") pod \"konnectivity-agent-cr2k7\" (UID: \"fa435174-0b32-4191-b42c-ad32bd3bc5db\") " pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-systemd-units\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-registration-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-log-socket\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fa435174-0b32-4191-b42c-ad32bd3bc5db-konnectivity-ca\") pod \"konnectivity-agent-cr2k7\" (UID: \"fa435174-0b32-4191-b42c-ad32bd3bc5db\") " pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367836 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-run-systemd\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b680da6-ab85-4c31-98d8-35be4b07624b-tmp-dir\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-log-socket\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.367946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" 
Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-sys-fs\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-modprobe-d\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovnkube-config\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-lib-modules\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m4z9\" (UniqueName: \"kubernetes.io/projected/77d59171-3e29-4e55-a4d9-a076a67a50ce-kube-api-access-5m4z9\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-device-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-etc-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-env-overrides\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-system-cni-dir\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-etc-openvswitch\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-etc-selinux\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368307 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-kubernetes\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-system-cni-dir\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysctl-conf\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-kubelet\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovn-node-metrics-cert\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cnibin\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-kubelet\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-cni-binary-copy\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368566 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-cnibin\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl5zd\" (UniqueName: \"kubernetes.io/projected/87cef1d3-c711-4f53-a775-8b55ec1bcf86-kube-api-access-nl5zd\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b680da6-ab85-4c31-98d8-35be4b07624b-tmp-dir\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.369923 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:07.370363 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/04113a02-0dc7-42c8-a11b-4684fb794c4f-env-overrides\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.370363 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84xv2\" (UniqueName: \"kubernetes.io/projected/04113a02-0dc7-42c8-a11b-4684fb794c4f-kube-api-access-84xv2\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.370363 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-sys\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.370363 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-var-lib-kubelet\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.370363 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-slash\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.370363 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.368851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/04113a02-0dc7-42c8-a11b-4684fb794c4f-host-slash\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.370363 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.369015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-cni-binary-copy\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.371490 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.371474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/04113a02-0dc7-42c8-a11b-4684fb794c4f-ovn-node-metrics-cert\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.371554 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.371543 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fa435174-0b32-4191-b42c-ad32bd3bc5db-agent-certs\") pod \"konnectivity-agent-cr2k7\" (UID: \"fa435174-0b32-4191-b42c-ad32bd3bc5db\") " pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:07.376073 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.376049 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:07.376073 ip-10-0-141-131 kubenswrapper[2575]: 
E0416 13:57:07.376076 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:07.376262 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.376090 2575 projected.go:194] Error preparing data for projected volume kube-api-access-fjl4q for pod openshift-network-diagnostics/network-check-target-96bs9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:07.376465 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.376446 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q podName:b45111e6-3682-445c-ac82-d2870a0cac78 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:07.876422937 +0000 UTC m=+2.054671803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fjl4q" (UniqueName: "kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q") pod "network-check-target-96bs9" (UID: "b45111e6-3682-445c-ac82-d2870a0cac78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:07.380083 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.379572 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m4z9\" (UniqueName: \"kubernetes.io/projected/77d59171-3e29-4e55-a4d9-a076a67a50ce-kube-api-access-5m4z9\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:07.380083 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.379600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s4lp\" (UniqueName: \"kubernetes.io/projected/1b680da6-ab85-4c31-98d8-35be4b07624b-kube-api-access-5s4lp\") pod \"node-resolver-hvqjp\" (UID: \"1b680da6-ab85-4c31-98d8-35be4b07624b\") " pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.380083 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.379605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbvt8\" (UniqueName: \"kubernetes.io/projected/b3cb1a86-beca-4c98-9d1a-b08d033e57ac-kube-api-access-cbvt8\") pod \"multus-mmczn\" (UID: \"b3cb1a86-beca-4c98-9d1a-b08d033e57ac\") " pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.380083 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.379682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc2nn\" (UniqueName: \"kubernetes.io/projected/cb697396-e88e-4780-9f6a-2109bfc21e0f-kube-api-access-qc2nn\") pod \"node-ca-89jl9\" (UID: \"cb697396-e88e-4780-9f6a-2109bfc21e0f\") " pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.380083 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.379979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgp8f\" (UniqueName: \"kubernetes.io/projected/0a0fcf8a-7d68-4b75-b145-75ba1622662d-kube-api-access-tgp8f\") pod \"iptables-alerter-5kvpq\" (UID: \"0a0fcf8a-7d68-4b75-b145-75ba1622662d\") " pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.381516 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.381498 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m9xzl\" (UniqueName: \"kubernetes.io/projected/85abdd4c-8c23-4bf2-b9a2-a5e83b75807a-kube-api-access-m9xzl\") pod \"multus-additional-cni-plugins-xjl2f\" (UID: \"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a\") " pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.381939 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.381924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84xv2\" (UniqueName: \"kubernetes.io/projected/04113a02-0dc7-42c8-a11b-4684fb794c4f-kube-api-access-84xv2\") pod \"ovnkube-node-qv8tl\" (UID: \"04113a02-0dc7-42c8-a11b-4684fb794c4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.469622 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-sys\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469622 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-var-lib-kubelet\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87cef1d3-c711-4f53-a775-8b55ec1bcf86-tmp\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb744\" (UniqueName: \"kubernetes.io/projected/7156eacb-925d-44f8-8084-e8c54e35371b-kube-api-access-wb744\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-systemd\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-sys\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 
13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-var-lib-kubelet\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysctl-d\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-systemd\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-run\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.469854 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-run\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysctl-d\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-socket-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysconfig\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-host\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-socket-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.469991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-host\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysconfig\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-tuned\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-registration-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-sys-fs\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-modprobe-d\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470154 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-lib-modules\") pod 
\"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-device-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-etc-selinux\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-registration-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-kubernetes\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470236 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-device-dir\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysctl-conf\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl5zd\" (UniqueName: \"kubernetes.io/projected/87cef1d3-c711-4f53-a775-8b55ec1bcf86-kube-api-access-nl5zd\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-modprobe-d\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470313 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-lib-modules\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-sysctl-conf\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.470862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-sys-fs\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7156eacb-925d-44f8-8084-e8c54e35371b-etc-selinux\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.470862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.470388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-kubernetes\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.471918 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.471899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/87cef1d3-c711-4f53-a775-8b55ec1bcf86-tmp\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.472038 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.472019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/87cef1d3-c711-4f53-a775-8b55ec1bcf86-etc-tuned\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.478030 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.478003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb744\" (UniqueName: \"kubernetes.io/projected/7156eacb-925d-44f8-8084-e8c54e35371b-kube-api-access-wb744\") pod \"aws-ebs-csi-driver-node-hl7jg\" (UID: \"7156eacb-925d-44f8-8084-e8c54e35371b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.478150 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.478137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl5zd\" (UniqueName: \"kubernetes.io/projected/87cef1d3-c711-4f53-a775-8b55ec1bcf86-kube-api-access-nl5zd\") pod \"tuned-xfspm\" (UID: \"87cef1d3-c711-4f53-a775-8b55ec1bcf86\") " pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.583443 ip-10-0-141-131 kubenswrapper[2575]: 
I0416 13:57:07.583364 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hvqjp" Apr 16 13:57:07.588113 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.588096 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-89jl9" Apr 16 13:57:07.604681 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.604656 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5kvpq" Apr 16 13:57:07.609224 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.609201 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" Apr 16 13:57:07.631818 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.631787 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mmczn" Apr 16 13:57:07.651540 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.651503 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:07.664243 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.664213 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:07.685357 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.685326 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:57:07.690295 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.690275 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" Apr 16 13:57:07.695188 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.695158 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xfspm" Apr 16 13:57:07.873750 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.873654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:07.873872 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.873771 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:07.873872 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.873821 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 13:57:08.873808198 +0000 UTC m=+3.052057043 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:07.974309 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:07.974280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:07.974550 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.974431 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:07.974550 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.974452 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:07.974550 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.974462 2575 projected.go:194] Error preparing data for projected volume kube-api-access-fjl4q for pod openshift-network-diagnostics/network-check-target-96bs9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:07.974550 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:07.974513 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q podName:b45111e6-3682-445c-ac82-d2870a0cac78 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:08.974499399 +0000 UTC m=+3.152748244 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fjl4q" (UniqueName: "kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q") pod "network-check-target-96bs9" (UID: "b45111e6-3682-445c-ac82-d2870a0cac78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:08.026331 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.026308 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:57:08.035200 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:08.035158 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bcc61cd4afd68fd131f750c02f2018e.slice/crio-ce60f2acecbfe9e9671ef55e53a547acd6dc6cf95d95215fa2da8ec1154adef7 WatchSource:0}: Error finding container ce60f2acecbfe9e9671ef55e53a547acd6dc6cf95d95215fa2da8ec1154adef7: Status 404 returned error can't find the container with id ce60f2acecbfe9e9671ef55e53a547acd6dc6cf95d95215fa2da8ec1154adef7 Apr 16 13:57:08.035402 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:08.035376 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb697396_e88e_4780_9f6a_2109bfc21e0f.slice/crio-8aac3c9716f92e36e197b4774ec49170fbbddde35850f4ebd9892dc8fb378de3 WatchSource:0}: Error finding container 8aac3c9716f92e36e197b4774ec49170fbbddde35850f4ebd9892dc8fb378de3: Status 404 returned error can't find the container with id 8aac3c9716f92e36e197b4774ec49170fbbddde35850f4ebd9892dc8fb378de3 Apr 16 13:57:08.035771 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:08.035750 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04113a02_0dc7_42c8_a11b_4684fb794c4f.slice/crio-bdb9d427d02de5afc955763ef8df0c881b54c54b280f0d7be2506277ec40090f WatchSource:0}: Error finding container bdb9d427d02de5afc955763ef8df0c881b54c54b280f0d7be2506277ec40090f: Status 404 returned error can't find the container with id bdb9d427d02de5afc955763ef8df0c881b54c54b280f0d7be2506277ec40090f Apr 16 13:57:08.038967 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.038952 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:57:08.305922 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.305814 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:52:07 +0000 UTC" deadline="2027-12-04 06:57:48.970627198 +0000 UTC" Apr 16 13:57:08.305922 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.305861 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14321h0m40.664770501s" Apr 16 13:57:08.351017 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:08.350983 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa435174_0b32_4191_b42c_ad32bd3bc5db.slice/crio-7cf3ee59e9249da482a42490e0a6b3d44ed30b0ae2c1b52e268ea49c82fb3e4b WatchSource:0}: Error finding container 7cf3ee59e9249da482a42490e0a6b3d44ed30b0ae2c1b52e268ea49c82fb3e4b: Status 404 returned error can't find the container with id 7cf3ee59e9249da482a42490e0a6b3d44ed30b0ae2c1b52e268ea49c82fb3e4b Apr 16 13:57:08.386825 
ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.386791 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:08.386989 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:08.386893 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:08.390493 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.390437 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cr2k7" event={"ID":"fa435174-0b32-4191-b42c-ad32bd3bc5db","Type":"ContainerStarted","Data":"7cf3ee59e9249da482a42490e0a6b3d44ed30b0ae2c1b52e268ea49c82fb3e4b"} Apr 16 13:57:08.391145 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.391113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" event={"ID":"8bcc61cd4afd68fd131f750c02f2018e","Type":"ContainerStarted","Data":"ce60f2acecbfe9e9671ef55e53a547acd6dc6cf95d95215fa2da8ec1154adef7"} Apr 16 13:57:08.392421 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.392391 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"bdb9d427d02de5afc955763ef8df0c881b54c54b280f0d7be2506277ec40090f"} Apr 16 13:57:08.393479 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.393455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-89jl9" event={"ID":"cb697396-e88e-4780-9f6a-2109bfc21e0f","Type":"ContainerStarted","Data":"8aac3c9716f92e36e197b4774ec49170fbbddde35850f4ebd9892dc8fb378de3"} Apr 16 13:57:08.646785 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:08.646709 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3cb1a86_beca_4c98_9d1a_b08d033e57ac.slice/crio-b71f2aba0bd6f64deb13c3e48f304e058c7474f93e14232b41e06a59623fb7aa WatchSource:0}: Error finding container b71f2aba0bd6f64deb13c3e48f304e058c7474f93e14232b41e06a59623fb7aa: Status 404 returned error can't find the container with id b71f2aba0bd6f64deb13c3e48f304e058c7474f93e14232b41e06a59623fb7aa Apr 16 13:57:08.673369 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:08.673329 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7156eacb_925d_44f8_8084_e8c54e35371b.slice/crio-9ed2f5b9110e81a3861bb85b1d886ddf4cf6477828882d6c1ff50960e88096a5 WatchSource:0}: Error finding container 9ed2f5b9110e81a3861bb85b1d886ddf4cf6477828882d6c1ff50960e88096a5: Status 404 returned error can't find the container with id 9ed2f5b9110e81a3861bb85b1d886ddf4cf6477828882d6c1ff50960e88096a5 Apr 16 13:57:08.765157 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:08.765118 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a0fcf8a_7d68_4b75_b145_75ba1622662d.slice/crio-9d477439a72b1d191eae69d45051b4eb3ada037e51d0d5c66ea221df1465ac72 WatchSource:0}: Error finding container 
9d477439a72b1d191eae69d45051b4eb3ada037e51d0d5c66ea221df1465ac72: Status 404 returned error can't find the container with id 9d477439a72b1d191eae69d45051b4eb3ada037e51d0d5c66ea221df1465ac72 Apr 16 13:57:08.879645 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.879610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:08.879860 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:08.879827 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:08.879932 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:08.879915 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 13:57:10.879896676 +0000 UTC m=+5.058145526 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:08.956402 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.956315 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:57:08.980567 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:08.980523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:08.980823 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:08.980771 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:08.980823 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:08.980798 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:08.980823 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:08.980811 2575 projected.go:194] Error preparing data for projected volume kube-api-access-fjl4q for pod openshift-network-diagnostics/network-check-target-96bs9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:08.981019 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:08.980878 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q podName:b45111e6-3682-445c-ac82-d2870a0cac78 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:10.980856992 +0000 UTC m=+5.159105839 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fjl4q" (UniqueName: "kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q") pod "network-check-target-96bs9" (UID: "b45111e6-3682-445c-ac82-d2870a0cac78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:09.306065 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:09.305974 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:52:07 +0000 UTC" deadline="2028-01-20 21:29:47.144262358 +0000 UTC" Apr 16 13:57:09.306065 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:09.306016 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15463h32m37.838250434s" Apr 16 13:57:09.386704 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:09.386672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:09.386864 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:09.386827 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:09.396086 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:09.396030 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" event={"ID":"7156eacb-925d-44f8-8084-e8c54e35371b","Type":"ContainerStarted","Data":"9ed2f5b9110e81a3861bb85b1d886ddf4cf6477828882d6c1ff50960e88096a5"} Apr 16 13:57:09.397318 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:09.397290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmczn" event={"ID":"b3cb1a86-beca-4c98-9d1a-b08d033e57ac","Type":"ContainerStarted","Data":"b71f2aba0bd6f64deb13c3e48f304e058c7474f93e14232b41e06a59623fb7aa"} Apr 16 13:57:09.398413 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:09.398383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5kvpq" event={"ID":"0a0fcf8a-7d68-4b75-b145-75ba1622662d","Type":"ContainerStarted","Data":"9d477439a72b1d191eae69d45051b4eb3ada037e51d0d5c66ea221df1465ac72"} Apr 16 13:57:09.399436 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:09.399410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" event={"ID":"d8d2889d114a061679832b8c70f242a6","Type":"ContainerStarted","Data":"f6d9d19800426c3db910c8211e00d9fdf536fa14a8f5b6e04b5f1a1b41dd8c48"} Apr 16 13:57:09.984588 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:09.984546 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b680da6_ab85_4c31_98d8_35be4b07624b.slice/crio-8000c861a632a5c4a122db86059fb20c2d90bd1655284ce05f004a69716e8990 WatchSource:0}: Error finding container 8000c861a632a5c4a122db86059fb20c2d90bd1655284ce05f004a69716e8990: Status 404 returned error can't find the container with id 8000c861a632a5c4a122db86059fb20c2d90bd1655284ce05f004a69716e8990 Apr 16 
13:57:10.388964 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:10.388881 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:10.389400 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:10.389006 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:10.409769 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:10.409715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hvqjp" event={"ID":"1b680da6-ab85-4c31-98d8-35be4b07624b","Type":"ContainerStarted","Data":"8000c861a632a5c4a122db86059fb20c2d90bd1655284ce05f004a69716e8990"} Apr 16 13:57:10.894399 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:10.894364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:10.894563 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:10.894540 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:10.894628 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:10.894604 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 13:57:14.894585183 +0000 UTC m=+9.072834047 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:10.989941 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:10.989907 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5wxx6"] Apr 16 13:57:10.991797 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:10.991774 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:10.991932 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:10.991855 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:10.995321 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:10.995026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:10.995321 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:10.995200 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:10.995321 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:10.995220 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:10.995321 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:10.995233 2575 projected.go:194] Error preparing data for projected volume kube-api-access-fjl4q for pod openshift-network-diagnostics/network-check-target-96bs9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:10.995321 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:10.995284 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q podName:b45111e6-3682-445c-ac82-d2870a0cac78 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:14.995266829 +0000 UTC m=+9.173515681 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fjl4q" (UniqueName: "kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q") pod "network-check-target-96bs9" (UID: "b45111e6-3682-445c-ac82-d2870a0cac78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:11.095815 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.095577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-dbus\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.095815 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.095657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.095815 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.095705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-kubelet-config\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.196462 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.196373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-dbus\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.196462 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.196450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.196685 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.196488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-kubelet-config\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.196685 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.196601 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-kubelet-config\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.196848 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.196757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-dbus\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.196905 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:11.196859 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:11.196957 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:11.196917 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret podName:13c9904e-5fed-4ef4-845b-0a77e68bc8f7 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:11.696898348 +0000 UTC m=+5.875147208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret") pod "global-pull-secret-syncer-5wxx6" (UID: "13c9904e-5fed-4ef4-845b-0a77e68bc8f7") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:11.388114 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.388076 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:11.388296 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:11.388216 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:11.700870 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:11.700831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:11.701332 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:11.700982 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:11.701332 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:11.701044 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret podName:13c9904e-5fed-4ef4-845b-0a77e68bc8f7 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:12.701025878 +0000 UTC m=+6.879274743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret") pod "global-pull-secret-syncer-5wxx6" (UID: "13c9904e-5fed-4ef4-845b-0a77e68bc8f7") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:12.387344 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:12.387310 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:12.387530 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:12.387447 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:12.710244 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:12.710159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:12.710700 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:12.710350 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:12.710700 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:12.710423 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret podName:13c9904e-5fed-4ef4-845b-0a77e68bc8f7 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:14.710403491 +0000 UTC m=+8.888652354 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret") pod "global-pull-secret-syncer-5wxx6" (UID: "13c9904e-5fed-4ef4-845b-0a77e68bc8f7") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:13.386118 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:13.386082 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:13.386296 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:13.386228 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:13.386811 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:13.386651 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:13.386811 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:13.386769 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:14.389464 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:14.388980 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:14.389464 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:14.389101 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:14.726360 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:14.726207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:14.726527 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:14.726379 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:14.726527 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:14.726465 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret podName:13c9904e-5fed-4ef4-845b-0a77e68bc8f7 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:18.726442461 +0000 UTC m=+12.904691313 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret") pod "global-pull-secret-syncer-5wxx6" (UID: "13c9904e-5fed-4ef4-845b-0a77e68bc8f7") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:14.927809 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:14.927772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:14.928006 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:14.927981 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:14.928125 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:14.928049 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 13:57:22.928030759 +0000 UTC m=+17.106279611 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:15.029045 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:15.028931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:15.029211 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:15.029097 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:15.029211 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:15.029118 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:15.029211 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:15.029130 2575 projected.go:194] Error preparing data for projected volume kube-api-access-fjl4q for pod openshift-network-diagnostics/network-check-target-96bs9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:15.029211 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:15.029182 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q podName:b45111e6-3682-445c-ac82-d2870a0cac78 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:23.029165594 +0000 UTC m=+17.207414455 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fjl4q" (UniqueName: "kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q") pod "network-check-target-96bs9" (UID: "b45111e6-3682-445c-ac82-d2870a0cac78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:15.370406 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:15.370187 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85abdd4c_8c23_4bf2_b9a2_a5e83b75807a.slice/crio-4e449a226e10cb5c5a1c963ea5fa29691df4e870447f171fc5b6e5986a1157fc WatchSource:0}: Error finding container 4e449a226e10cb5c5a1c963ea5fa29691df4e870447f171fc5b6e5986a1157fc: Status 404 returned error can't find the container with id 4e449a226e10cb5c5a1c963ea5fa29691df4e870447f171fc5b6e5986a1157fc Apr 16 13:57:15.386322 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:15.386296 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:15.386611 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:15.386419 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:15.386611 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:15.386483 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:15.386611 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:15.386583 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:15.422884 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:15.422286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" event={"ID":"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a","Type":"ContainerStarted","Data":"4e449a226e10cb5c5a1c963ea5fa29691df4e870447f171fc5b6e5986a1157fc"} Apr 16 13:57:15.425023 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:15.424829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xfspm" event={"ID":"87cef1d3-c711-4f53-a775-8b55ec1bcf86","Type":"ContainerStarted","Data":"91e85fb1e4bee1c3bcfe624d730afcaa8f9df1f8eef58af5fff6a1cac2ff463f"} Apr 16 13:57:16.388002 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.387923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:16.388189 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:16.388040 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:16.428908 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.428859 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" event={"ID":"7156eacb-925d-44f8-8084-e8c54e35371b","Type":"ContainerStarted","Data":"0a3b895702ce6c17c8c249ae54bc30cdab0a1eae7b90bf5af3b3b9e21f4b1739"} Apr 16 13:57:16.433427 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.433372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cr2k7" event={"ID":"fa435174-0b32-4191-b42c-ad32bd3bc5db","Type":"ContainerStarted","Data":"3e210f514d2e07a95dd9cb8bc02c628180f398666f3f3944fbb87292eca7e671"} Apr 16 13:57:16.436210 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.436109 2575 generic.go:358] "Generic (PLEG): container finished" podID="8bcc61cd4afd68fd131f750c02f2018e" containerID="340872a15400fc09d76ef7a02708006f50d9b90e9dcc5ea05ee37c70ee18ff00" exitCode=0 Apr 16 13:57:16.436210 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.436194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" event={"ID":"8bcc61cd4afd68fd131f750c02f2018e","Type":"ContainerDied","Data":"340872a15400fc09d76ef7a02708006f50d9b90e9dcc5ea05ee37c70ee18ff00"} Apr 16 13:57:16.439416 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.439392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" event={"ID":"d8d2889d114a061679832b8c70f242a6","Type":"ContainerStarted","Data":"405dce8036bb84699233e0ff906c72e4f9d2db23225a7b68a3d27344592d065c"} Apr 16 13:57:16.442585 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.442377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hvqjp" event={"ID":"1b680da6-ab85-4c31-98d8-35be4b07624b","Type":"ContainerStarted","Data":"67ddbed830c5e5e26713751bced74ed898a4ecd73d1d4cfa82fd20953cbf45ae"} Apr 16 13:57:16.444545 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.444520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-89jl9" event={"ID":"cb697396-e88e-4780-9f6a-2109bfc21e0f","Type":"ContainerStarted","Data":"f1aff3176f10534d83ba6d72123e567b1bed22995a4a5d2a98531a373eae9347"} Apr 16 13:57:16.458889 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.458825 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cr2k7" podStartSLOduration=3.515700797 podStartE2EDuration="10.458809669s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:08.353236272 +0000 UTC m=+2.531485124" lastFinishedPulling="2026-04-16 13:57:15.296345136 +0000 UTC m=+9.474593996" observedRunningTime="2026-04-16 13:57:16.447616153 +0000 UTC m=+10.625865022" watchObservedRunningTime="2026-04-16 13:57:16.458809669 +0000 UTC m=+10.637058537" Apr 16 13:57:16.459540 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.459284 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-131.ec2.internal" podStartSLOduration=9.459277072999999 podStartE2EDuration="9.459277073s" podCreationTimestamp="2026-04-16 13:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 13:57:16.458555272 +0000 UTC m=+10.636804140" watchObservedRunningTime="2026-04-16 13:57:16.459277073 +0000 UTC m=+10.637525943" Apr 16 13:57:16.500112 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:16.500051 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hvqjp" podStartSLOduration=5.109094315 podStartE2EDuration="10.50003052s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:09.986388174 +0000 UTC m=+4.164637033" lastFinishedPulling="2026-04-16 13:57:15.377324378 +0000 UTC m=+9.555573238" observedRunningTime="2026-04-16 13:57:16.487964532 +0000 UTC m=+10.666213401" watchObservedRunningTime="2026-04-16 13:57:16.50003052 +0000 UTC m=+10.678279391" Apr 16 13:57:17.386373 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:17.386284 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:17.386536 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:17.386416 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:17.386942 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:17.386797 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:17.386942 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:17.386902 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:17.448767 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:17.448319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5kvpq" event={"ID":"0a0fcf8a-7d68-4b75-b145-75ba1622662d","Type":"ContainerStarted","Data":"fb73975a5f1e4216e1c1150b45b9fc88c86fd7f0e83fc4c1a011537081b9cd1a"} Apr 16 13:57:17.462318 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:17.461543 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-89jl9" podStartSLOduration=4.161410032 podStartE2EDuration="11.461523672s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:08.039201685 +0000 UTC m=+2.217450531" lastFinishedPulling="2026-04-16 13:57:15.339315325 +0000 UTC m=+9.517564171" observedRunningTime="2026-04-16 13:57:16.500858468 +0000 UTC m=+10.679107338" watchObservedRunningTime="2026-04-16 13:57:17.461523672 +0000 UTC m=+11.639772540" Apr 16 13:57:17.749370 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:17.749284 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:17.750016 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:17.749992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:17.765545 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:17.765367 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5kvpq" podStartSLOduration=5.186607831 podStartE2EDuration="11.765353687s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:08.767398374 +0000 UTC m=+2.945647224" lastFinishedPulling="2026-04-16 13:57:15.346144229 +0000 UTC m=+9.524393080" observedRunningTime="2026-04-16 13:57:17.462202068 +0000 UTC m=+11.640450937" watchObservedRunningTime="2026-04-16 13:57:17.765353687 +0000 UTC m=+11.943602556" Apr 16 13:57:18.386553 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:18.386513 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:18.386736 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:18.386631 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:18.758426 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:18.758329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:18.758882 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:18.758466 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:18.758882 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:18.758522 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret podName:13c9904e-5fed-4ef4-845b-0a77e68bc8f7 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:26.758508573 +0000 UTC m=+20.936757424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret") pod "global-pull-secret-syncer-5wxx6" (UID: "13c9904e-5fed-4ef4-845b-0a77e68bc8f7") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:19.386406 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:19.386366 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:19.386592 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:19.386371 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:19.386592 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:19.386501 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:19.386690 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:19.386584 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:19.450988 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:19.450958 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:57:20.387141 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:20.386948 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:20.387600 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:20.387252 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:21.386709 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:21.386672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:21.386945 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:21.386672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:21.386945 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:21.386819 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:21.386945 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:21.386900 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:22.386270 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:22.386219 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:22.386870 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:22.386380 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:22.731463 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:22.731357 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:22.731622 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:22.731522 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:57:22.732186 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:22.732159 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cr2k7" Apr 16 13:57:22.990961 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:22.990875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:22.991129 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:22.991043 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:22.991129 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:22.991122 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 13:57:38.991103449 +0000 UTC m=+33.169352309 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:23.092031 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:23.091990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:23.092207 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:23.092183 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:23.092266 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:23.092210 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:23.092266 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:23.092223 2575 projected.go:194] Error preparing data for projected volume kube-api-access-fjl4q for pod openshift-network-diagnostics/network-check-target-96bs9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:23.092357 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:23.092288 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q podName:b45111e6-3682-445c-ac82-d2870a0cac78 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:39.0922683 +0000 UTC m=+33.270517154 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fjl4q" (UniqueName: "kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q") pod "network-check-target-96bs9" (UID: "b45111e6-3682-445c-ac82-d2870a0cac78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:23.386568 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:23.386489 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:23.387029 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:23.386494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:23.387029 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:23.386590 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:23.387029 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:23.386708 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:24.386938 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:24.386898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:24.387460 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:24.387039 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:25.386350 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:25.386306 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:25.386540 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:25.386306 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:25.386540 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:25.386450 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:25.386654 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:25.386541 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:26.387914 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:26.387874 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:26.388478 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:26.388224 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:26.820251 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:26.820176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:26.820411 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:26.820297 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:26.820411 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:26.820360 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret podName:13c9904e-5fed-4ef4-845b-0a77e68bc8f7 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:42.820340516 +0000 UTC m=+36.998589367 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret") pod "global-pull-secret-syncer-5wxx6" (UID: "13c9904e-5fed-4ef4-845b-0a77e68bc8f7") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:57:27.386209 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:27.385860 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:27.386209 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:27.385911 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:27.386209 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:27.386007 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:27.386476 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:27.386452 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:27.472657 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:27.470689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmczn" event={"ID":"b3cb1a86-beca-4c98-9d1a-b08d033e57ac","Type":"ContainerStarted","Data":"a5ba354c3d7e50c381d29638e04a4ea52c779ac027e7538d0d7933a0fe498573"} Apr 16 13:57:27.479461 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:27.478004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xfspm" event={"ID":"87cef1d3-c711-4f53-a775-8b55ec1bcf86","Type":"ContainerStarted","Data":"6a2a1404b066d3e7a2dd191c04c98f1ad4f6655bfe308d94a64e01630ef1082d"} Apr 16 13:57:27.487442 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:27.486811 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mmczn" podStartSLOduration=2.791951455 podStartE2EDuration="21.486795041s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:08.6485601 +0000 UTC m=+2.826808953" lastFinishedPulling="2026-04-16 13:57:27.343403675 +0000 UTC m=+21.521652539" observedRunningTime="2026-04-16 13:57:27.486258872 +0000 UTC m=+21.664507740" watchObservedRunningTime="2026-04-16 13:57:27.486795041 +0000 UTC m=+21.665043910" Apr 16 13:57:27.506949 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:27.506352 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xfspm" podStartSLOduration=9.523486557 podStartE2EDuration="21.506334611s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:15.361135432 +0000 UTC m=+9.539384284" lastFinishedPulling="2026-04-16 13:57:27.343983485 +0000 UTC m=+21.522232338" observedRunningTime="2026-04-16 13:57:27.505568076 +0000 UTC m=+21.683817144" watchObservedRunningTime="2026-04-16 13:57:27.506334611 +0000 UTC m=+21.684583480" Apr 16 13:57:27.588967 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:27.588934 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:57:28.331500 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.331204 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:57:27.588953666Z","UUID":"17512ce5-1029-4f44-908d-cd08f4146788","Handler":null,"Name":"","Endpoint":""} Apr 16 13:57:28.334583 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.334554 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:57:28.334583 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.334588 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: 
ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:57:28.386860 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.386822 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:28.387035 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:28.386953 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:28.481403 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.481368 2575 generic.go:358] "Generic (PLEG): container finished" podID="85abdd4c-8c23-4bf2-b9a2-a5e83b75807a" containerID="ff6ac65fcdbc4f0aa962dc8b49307f68ee34e1550dbcfa62bcdbdfab3ffb04a1" exitCode=0 Apr 16 13:57:28.481918 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.481459 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" event={"ID":"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a","Type":"ContainerDied","Data":"ff6ac65fcdbc4f0aa962dc8b49307f68ee34e1550dbcfa62bcdbdfab3ffb04a1"} Apr 16 13:57:28.484650 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.484624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"89a1b2931c9003506ca96a1b0f8a8435da03c4c14fce9eb75fb92604ad39f186"} Apr 16 13:57:28.484766 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.484663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"506ef2065fef240e1e5f977f04259301709719ba9d0d6d3db3786013181e1940"} Apr 16 13:57:28.484766 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.484677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"7cb2aadb45d5a54289e9fd82bf2662dbee562d89bde68db4a355906e728b3ac7"} Apr 16 13:57:28.484766 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.484689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"0ca1ca7e2f047610b14fa373b09b13463f929743ac16c4bdedd629ed6ab64cf3"} Apr 16 13:57:28.484766 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.484701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"4ea1ae6b6eb90476852bf1a34d8b75e40a7ab3fd54c060ff6a052f2e79653390"} Apr 16 13:57:28.484766 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.484713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"c3902072d2a509f3337af93a952f52c3c6cbf2acdeffa3e8917e8f6fecb18fcc"} Apr 16 13:57:28.486427 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.486406 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" event={"ID":"7156eacb-925d-44f8-8084-e8c54e35371b","Type":"ContainerStarted","Data":"eb3d0bd075ffc6957732970c6630c022f51b5405a0567a0d515cbf6246ec42d1"} Apr 16 13:57:28.488175 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.488152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" event={"ID":"8bcc61cd4afd68fd131f750c02f2018e","Type":"ContainerStarted","Data":"7774e76de3a352644e976f6d04f9551c64efdc46bb1f2bcde0c61e467bf8b60f"} Apr 16 13:57:28.526136 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:28.526081 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-131.ec2.internal" podStartSLOduration=21.526064853 podStartE2EDuration="21.526064853s" podCreationTimestamp="2026-04-16 13:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:57:28.525874633 +0000 UTC m=+22.704123502" watchObservedRunningTime="2026-04-16 13:57:28.526064853 +0000 UTC m=+22.704313720" Apr 16 13:57:29.386283 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:29.386252 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:29.386447 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:29.386254 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:29.386447 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:29.386347 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:29.386447 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:29.386431 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:29.492591 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:29.492546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" event={"ID":"7156eacb-925d-44f8-8084-e8c54e35371b","Type":"ContainerStarted","Data":"21c8f2fe84fd78f9b3f9b1192fe1e4705de69c4f52b9b2b2c639c5d2f6f975bb"} Apr 16 13:57:29.511456 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:29.511392 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hl7jg" podStartSLOduration=3.540064688 podStartE2EDuration="23.511377516s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:08.678317366 +0000 UTC m=+2.856566217" lastFinishedPulling="2026-04-16 13:57:28.6496302 +0000 UTC m=+22.827879045" observedRunningTime="2026-04-16 13:57:29.510972458 +0000 UTC m=+23.689221325" watchObservedRunningTime="2026-04-16 13:57:29.511377516 +0000 UTC m=+23.689626396" Apr 16 13:57:30.386425 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:30.386390 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:30.386600 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:30.386507 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:30.498329 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:30.498282 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"6273a0ac8b50e8b15813bff2b2533829f47d409cef6b1e66a7da4240ace679e0"} Apr 16 13:57:31.386662 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:31.386626 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:31.386662 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:31.386648 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:31.386883 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:31.386783 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:31.386929 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:31.386905 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:32.386434 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:32.386401 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:32.387019 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:32.386516 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:33.386592 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:33.386417 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:33.387059 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:33.386418 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:33.387059 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:33.386689 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:33.387059 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:33.386772 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:33.505308 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:33.505206 2575 generic.go:358] "Generic (PLEG): container finished" podID="85abdd4c-8c23-4bf2-b9a2-a5e83b75807a" containerID="8303c430d5bda1332f483c5cd7a9ae705134c85a60c53704837545cbfc3edade" exitCode=0 Apr 16 13:57:33.505462 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:33.505293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" event={"ID":"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a","Type":"ContainerDied","Data":"8303c430d5bda1332f483c5cd7a9ae705134c85a60c53704837545cbfc3edade"} Apr 16 13:57:33.508401 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:33.508376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" event={"ID":"04113a02-0dc7-42c8-a11b-4684fb794c4f","Type":"ContainerStarted","Data":"dfde36ec16e72cf0a54926c78e41e8b170f3f4c0bb8e180100589eb6b2025bc4"} Apr 16 13:57:33.508769 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:33.508747 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:33.523325 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:33.523298 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:33.549644 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:33.549574 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" podStartSLOduration=8.244639945 podStartE2EDuration="27.549558322s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:08.039309807 +0000 UTC m=+2.217558652" lastFinishedPulling="2026-04-16 13:57:27.344228168 +0000 UTC m=+21.522477029" observedRunningTime="2026-04-16 13:57:33.549224569 +0000 UTC m=+27.727473453" watchObservedRunningTime="2026-04-16 13:57:33.549558322 +0000 UTC m=+27.727807190" Apr 16 13:57:34.386189 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.386161 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:34.386417 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:34.386264 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:34.511998 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.511835 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:34.511998 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.511967 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:34.527454 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.527430 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl" Apr 16 13:57:34.902457 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.902409 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5wxx6"] Apr 16 13:57:34.902620 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.902543 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:34.902681 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:34.902648 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:34.905896 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.905866 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-crlsp"] Apr 16 13:57:34.906013 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.905996 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:34.906129 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:34.906108 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:34.906414 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.906393 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-96bs9"] Apr 16 13:57:34.906520 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:34.906509 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:34.906620 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:34.906578 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:35.514312 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:35.514280 2575 generic.go:358] "Generic (PLEG): container finished" podID="85abdd4c-8c23-4bf2-b9a2-a5e83b75807a" containerID="6cf735d211defbcb7626d2d96d364427c812c020527ac9edf101cbbed42f3d2d" exitCode=0 Apr 16 13:57:35.514677 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:35.514349 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" event={"ID":"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a","Type":"ContainerDied","Data":"6cf735d211defbcb7626d2d96d364427c812c020527ac9edf101cbbed42f3d2d"} Apr 16 13:57:36.387404 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:36.387370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:36.387559 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:36.387421 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:36.387559 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:36.387503 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:36.387650 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:36.387580 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:37.386195 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:37.386039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6" Apr 16 13:57:37.386572 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:37.386274 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5wxx6" podUID="13c9904e-5fed-4ef4-845b-0a77e68bc8f7" Apr 16 13:57:37.519902 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:37.519864 2575 generic.go:358] "Generic (PLEG): container finished" podID="85abdd4c-8c23-4bf2-b9a2-a5e83b75807a" containerID="282b6dbc8c21d82cc2081ade3a760aaecb3b909fbd00ec09242f7bb8bb554129" exitCode=0 Apr 16 13:57:37.519902 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:37.519903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" event={"ID":"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a","Type":"ContainerDied","Data":"282b6dbc8c21d82cc2081ade3a760aaecb3b909fbd00ec09242f7bb8bb554129"} Apr 16 13:57:38.386097 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:38.386061 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:38.386097 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:38.386096 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:38.386613 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:38.386190 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-96bs9" podUID="b45111e6-3682-445c-ac82-d2870a0cac78" Apr 16 13:57:38.386613 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:38.386286 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:57:39.015736 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.015680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:57:39.015931 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.015831 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:39.015931 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.015899 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 13:58:11.015881462 +0000 UTC m=+65.194130312 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:39.116382 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.116342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9" Apr 16 13:57:39.116573 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.116509 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:39.116573 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.116532 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:39.116573 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.116544 2575 projected.go:194] Error preparing data for projected volume kube-api-access-fjl4q for pod openshift-network-diagnostics/network-check-target-96bs9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:39.116689 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.116607 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q podName:b45111e6-3682-445c-ac82-d2870a0cac78 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:11.116589296 +0000 UTC m=+65.294838147 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fjl4q" (UniqueName: "kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q") pod "network-check-target-96bs9" (UID: "b45111e6-3682-445c-ac82-d2870a0cac78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:39.193266 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.193236 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-131.ec2.internal" event="NodeReady" Apr 16 13:57:39.193452 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.193436 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:57:39.228413 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.228374 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b9d68db7f-fjmrf"] Apr 16 13:57:39.237014 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.236985 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.239949 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.239918 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qt6qg\""
Apr 16 13:57:39.240455 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.240436 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 13:57:39.240561 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.240531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 13:57:39.240561 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.240531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 13:57:39.243571 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.243544 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b9d68db7f-fjmrf"]
Apr 16 13:57:39.245598 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.245286 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g5lb6"]
Apr 16 13:57:39.252262 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.252236 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 13:57:39.252763 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.252714 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hgrw6"]
Apr 16 13:57:39.252900 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.252882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.254999 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.254977 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 13:57:39.255110 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.255008 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 13:57:39.255175 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.255152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ftzrr\""
Apr 16 13:57:39.259493 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.259467 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g5lb6"]
Apr 16 13:57:39.259659 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.259644 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:39.262469 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.262251 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 13:57:39.262469 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.262288 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mv45m\""
Apr 16 13:57:39.262469 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.262318 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 13:57:39.262469 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.262335 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 13:57:39.262469 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.262434 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hgrw6"]
Apr 16 13:57:39.317652 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.317603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-installation-pull-secrets\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.317973 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.317663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-bound-sa-token\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.317973 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.317752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.317973 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.317817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-certificates\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.317973 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.317847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-trusted-ca\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.317973 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.317882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-image-registry-private-configuration\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.317973 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.317948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6503505-8fc2-4e22-b559-ade573fe4d03-ca-trust-extracted\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.318188 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.317984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stsnk\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-kube-api-access-stsnk\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.386372 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.386297 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6"
Apr 16 13:57:39.388896 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.388872 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 13:57:39.418542 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-installation-pull-secrets\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.418739 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-bound-sa-token\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.418739 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92799234-6fff-45be-a27c-c70096483d30-tmp-dir\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.418739 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.418739 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418649 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nfj\" (UniqueName: \"kubernetes.io/projected/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-kube-api-access-69nfj\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:39.418739 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcc2l\" (UniqueName: \"kubernetes.io/projected/92799234-6fff-45be-a27c-c70096483d30-kube-api-access-qcc2l\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.418739 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418708 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-certificates\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.419044 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-trusted-ca\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.419044 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.418754 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:57:39.419044 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.418773 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found
Apr 16 13:57:39.419044 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.418831 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:39.918810089 +0000 UTC m=+34.097058951 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found
Apr 16 13:57:39.419044 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92799234-6fff-45be-a27c-c70096483d30-config-volume\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.419044 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.419044 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.418981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-image-registry-private-configuration\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.419044 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.419035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:39.419416 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.419082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6503505-8fc2-4e22-b559-ade573fe4d03-ca-trust-extracted\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.419416 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.419127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stsnk\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-kube-api-access-stsnk\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.419416 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.419373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-certificates\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.419631 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.419602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6503505-8fc2-4e22-b559-ade573fe4d03-ca-trust-extracted\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.419783 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.419764 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-trusted-ca\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.423160 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.423140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-installation-pull-secrets\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.423280 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.423142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-image-registry-private-configuration\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.430305 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.430258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-bound-sa-token\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.432620 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.432596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stsnk\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-kube-api-access-stsnk\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.520188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.520289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92799234-6fff-45be-a27c-c70096483d30-tmp-dir\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.520332 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.520417 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 13:57:40.020395612 +0000 UTC m=+34.198644477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.520342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69nfj\" (UniqueName: \"kubernetes.io/projected/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-kube-api-access-69nfj\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.520560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcc2l\" (UniqueName: \"kubernetes.io/projected/92799234-6fff-45be-a27c-c70096483d30-kube-api-access-qcc2l\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.520603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92799234-6fff-45be-a27c-c70096483d30-config-volume\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.520631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.520632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92799234-6fff-45be-a27c-c70096483d30-tmp-dir\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.520715 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:57:39.521005 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.520783 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:40.02076985 +0000 UTC m=+34.199018719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found
Apr 16 13:57:39.521639 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.521271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92799234-6fff-45be-a27c-c70096483d30-config-volume\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.531296 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.531259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nfj\" (UniqueName: \"kubernetes.io/projected/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-kube-api-access-69nfj\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:39.531426 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.531275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcc2l\" (UniqueName: \"kubernetes.io/projected/92799234-6fff-45be-a27c-c70096483d30-kube-api-access-qcc2l\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:39.923740 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:39.923688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:39.923927 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.923854 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:57:39.923927 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.923875 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found
Apr 16 13:57:39.923999 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:39.923941 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:40.9239169 +0000 UTC m=+35.102165754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found
Apr 16 13:57:40.024888 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.024829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:40.025043 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.024989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:40.025043 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:40.024995 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:57:40.025131 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:40.025058 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 13:57:41.025039554 +0000 UTC m=+35.203288400 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found
Apr 16 13:57:40.025131 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:40.025104 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:57:40.025217 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:40.025156 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:41.025142677 +0000 UTC m=+35.203391535 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found
Apr 16 13:57:40.386305 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.386272 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9"
Apr 16 13:57:40.386568 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.386531 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp"
Apr 16 13:57:40.389025 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.389001 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:57:40.389159 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.389043 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:57:40.389159 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.389065 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:57:40.389159 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.389095 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7n6t4\""
Apr 16 13:57:40.389308 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.389211 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jz268\""
Apr 16 13:57:40.932156 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:40.932121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:40.932345 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:40.932307 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:57:40.932345 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:40.932329 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found
Apr 16 13:57:40.932461 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:40.932402 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:42.932379968 +0000 UTC m=+37.110628832 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found
Apr 16 13:57:41.033067 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:41.033036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:41.033067 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:41.033079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:41.033256 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:41.033177 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:57:41.033256 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:41.033181 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:57:41.033256 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:41.033225 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 13:57:43.033211543 +0000 UTC m=+37.211460392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found
Apr 16 13:57:41.033256 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:41.033237 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:43.03323205 +0000 UTC m=+37.211480896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found
Apr 16 13:57:42.847868 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:42.847821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6"
Apr 16 13:57:42.850758 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:42.850707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13c9904e-5fed-4ef4-845b-0a77e68bc8f7-original-pull-secret\") pod \"global-pull-secret-syncer-5wxx6\" (UID: \"13c9904e-5fed-4ef4-845b-0a77e68bc8f7\") " pod="kube-system/global-pull-secret-syncer-5wxx6"
Apr 16 13:57:42.948495 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:42.948449 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:42.948758 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:42.948621 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:57:42.948758 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:42.948642 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found
Apr 16 13:57:42.948758 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:42.948708 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:46.948688378 +0000 UTC m=+41.126937239 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found
Apr 16 13:57:42.996328 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:42.996291 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5wxx6"
Apr 16 13:57:43.049338 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:43.049295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:43.049489 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:43.049352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:43.049489 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:43.049449 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:57:43.049563 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:43.049519 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:47.049497514 +0000 UTC m=+41.227746379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found
Apr 16 13:57:43.049563 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:43.049529 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:57:43.049675 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:43.049591 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 13:57:47.049571336 +0000 UTC m=+41.227820186 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found
Apr 16 13:57:43.384282 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:43.384031 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5wxx6"]
Apr 16 13:57:43.442374 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:57:43.442340 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13c9904e_5fed_4ef4_845b_0a77e68bc8f7.slice/crio-b2d88dc01e62b6275b5f7de10fd74f2ccc69b790ee38ec3fa972655308045427 WatchSource:0}: Error finding container b2d88dc01e62b6275b5f7de10fd74f2ccc69b790ee38ec3fa972655308045427: Status 404 returned error can't find the container with id b2d88dc01e62b6275b5f7de10fd74f2ccc69b790ee38ec3fa972655308045427
Apr 16 13:57:43.533855 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:43.533817 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5wxx6" event={"ID":"13c9904e-5fed-4ef4-845b-0a77e68bc8f7","Type":"ContainerStarted","Data":"b2d88dc01e62b6275b5f7de10fd74f2ccc69b790ee38ec3fa972655308045427"}
Apr 16 13:57:44.538204 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:44.538174 2575 generic.go:358] "Generic (PLEG): container finished" podID="85abdd4c-8c23-4bf2-b9a2-a5e83b75807a" containerID="ebd14dc543ffea73ad666e711163ef9204680a7d85bc3704198bcc69412f9f69" exitCode=0
Apr 16 13:57:44.538555 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:44.538234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" event={"ID":"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a","Type":"ContainerDied","Data":"ebd14dc543ffea73ad666e711163ef9204680a7d85bc3704198bcc69412f9f69"}
Apr 16 13:57:45.543430 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:45.543396 2575 generic.go:358] "Generic (PLEG): container finished" podID="85abdd4c-8c23-4bf2-b9a2-a5e83b75807a" containerID="97666377c4e57544f5d0e6fd210a5ebba12f93f281c90e878fd5a57d07c42966" exitCode=0
Apr 16 13:57:45.543855 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:45.543462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" event={"ID":"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a","Type":"ContainerDied","Data":"97666377c4e57544f5d0e6fd210a5ebba12f93f281c90e878fd5a57d07c42966"}
Apr 16 13:57:46.548916 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:46.548874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" event={"ID":"85abdd4c-8c23-4bf2-b9a2-a5e83b75807a","Type":"ContainerStarted","Data":"cc8c6cda66b43dc2011fd804eecb963feb98fcd351b27918c98f6143f939ae8b"}
Apr 16 13:57:46.572298 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:46.572218 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xjl2f" podStartSLOduration=12.467650723 podStartE2EDuration="40.572203387s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:57:15.373374494 +0000 UTC m=+9.551623342" lastFinishedPulling="2026-04-16 13:57:43.477927157 +0000 UTC m=+37.656176006" observedRunningTime="2026-04-16 13:57:46.570903204 +0000 UTC m=+40.749152084" watchObservedRunningTime="2026-04-16 13:57:46.572203387 +0000 UTC m=+40.750452254"
Apr 16 13:57:46.982450 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:46.982390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:46.982636 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:46.982565 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:57:46.982636 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:46.982589 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found
Apr 16 13:57:46.982739 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:46.982672 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:54.982649745 +0000 UTC m=+49.160898605 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found
Apr 16 13:57:47.083375 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:47.083334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:47.083375 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:47.083385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:47.083601 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:47.083496 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:57:47.083601 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:47.083497 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:57:47.083601 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:47.083561 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 13:57:55.083546798 +0000 UTC m=+49.261795643 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found
Apr 16 13:57:47.083601 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:47.083580 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:55.083571932 +0000 UTC m=+49.261820783 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found
Apr 16 13:57:48.553454 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:48.553364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5wxx6" event={"ID":"13c9904e-5fed-4ef4-845b-0a77e68bc8f7","Type":"ContainerStarted","Data":"13431a9e88c86cedd5654cb24ee89d871c12dd192de12f0a739e390bdeb05911"}
Apr 16 13:57:48.569232 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:48.569173 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5wxx6" podStartSLOduration=33.899497703 podStartE2EDuration="38.569160208s" podCreationTimestamp="2026-04-16 13:57:10 +0000 UTC" firstStartedPulling="2026-04-16 13:57:43.455084277 +0000 UTC m=+37.633333126" lastFinishedPulling="2026-04-16 13:57:48.124746768 +0000 UTC m=+42.302995631" observedRunningTime="2026-04-16 13:57:48.568900482 +0000 UTC m=+42.747149354" watchObservedRunningTime="2026-04-16 13:57:48.569160208 +0000 UTC m=+42.747409076"
Apr 16 13:57:55.043935 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:55.043894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:57:55.044345 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:55.044049 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:57:55.044345 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:55.044068 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found
Apr 16 13:57:55.044345 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:55.044125 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:11.044107852 +0000 UTC m=+65.222356698 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found
Apr 16 13:57:55.144994 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:55.144948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:57:55.144994 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:57:55.145002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:57:55.145296 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:55.145111 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:57:55.145296 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:55.145190 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:11.14516788 +0000 UTC m=+65.323416743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found
Apr 16 13:57:55.145296 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:55.145112 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:57:55.145296 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:57:55.145263 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 13:58:11.145249527 +0000 UTC m=+65.323498376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found
Apr 16 13:58:06.528891 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:06.528863 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8tl"
Apr 16 13:58:11.056939 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.056891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:58:11.057426 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.056976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp"
Apr 16 13:58:11.057426 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.057048 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:58:11.057426 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.057072 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found
Apr 16 13:58:11.057426 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.057137 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:43.057116664 +0000 UTC m=+97.235365517 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found
Apr 16 13:58:11.059565 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.059547 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:58:11.068153 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.068117 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:58:11.068235 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.068206 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 13:59:15.068188695 +0000 UTC m=+129.246437541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : secret "metrics-daemon-secret" not found
Apr 16 13:58:11.158001 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.157962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:58:11.158001 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.158001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9"
Apr 16 13:58:11.158245 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.158023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:58:11.158245 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.158131 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:58:11.158245 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.158135 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:58:11.158245 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.158182 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 13:58:43.158166851 +0000 UTC m=+97.336415697 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found
Apr 16 13:58:11.158245 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:11.158231 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:43.158211348 +0000 UTC m=+97.336460212 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found
Apr 16 13:58:11.160670 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.160652 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:58:11.170624 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.170595 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:58:11.183294 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.183257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjl4q\" (UniqueName: \"kubernetes.io/projected/b45111e6-3682-445c-ac82-d2870a0cac78-kube-api-access-fjl4q\") pod \"network-check-target-96bs9\" (UID: \"b45111e6-3682-445c-ac82-d2870a0cac78\") " pod="openshift-network-diagnostics/network-check-target-96bs9"
Apr 16 13:58:11.300615 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.300582 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7n6t4\""
Apr 16 13:58:11.308647 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.308583 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-96bs9"
Apr 16 13:58:11.419753 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.419705 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-96bs9"]
Apr 16 13:58:11.423295 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:58:11.423270 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45111e6_3682_445c_ac82_d2870a0cac78.slice/crio-d1d181b4fb97f7d1ad73c3e62ab2f6752049cf6d0ce73cd69a89adb7fef37d21 WatchSource:0}: Error finding container d1d181b4fb97f7d1ad73c3e62ab2f6752049cf6d0ce73cd69a89adb7fef37d21: Status 404 returned error can't find the container with id d1d181b4fb97f7d1ad73c3e62ab2f6752049cf6d0ce73cd69a89adb7fef37d21
Apr 16 13:58:11.596246 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:11.596157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-96bs9" event={"ID":"b45111e6-3682-445c-ac82-d2870a0cac78","Type":"ContainerStarted","Data":"d1d181b4fb97f7d1ad73c3e62ab2f6752049cf6d0ce73cd69a89adb7fef37d21"}
Apr 16 13:58:16.605832 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:16.605798 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-96bs9" event={"ID":"b45111e6-3682-445c-ac82-d2870a0cac78","Type":"ContainerStarted","Data":"6a5c985078998a7726cc802f93ef1afb87c30fa6dd4f783ae22d43afd72cbc8b"}
Apr 16 13:58:16.606280 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:16.605934 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-96bs9"
Apr 16 13:58:16.621214 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:16.621160 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-96bs9" podStartSLOduration=66.228868748 podStartE2EDuration="1m10.62114513s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 13:58:11.425543752 +0000 UTC m=+65.603792598" lastFinishedPulling="2026-04-16 13:58:15.817820135 +0000 UTC m=+69.996068980" observedRunningTime="2026-04-16 13:58:16.620767193 +0000 UTC m=+70.799016062" watchObservedRunningTime="2026-04-16 13:58:16.62114513 +0000 UTC m=+70.799394023"
Apr 16 13:58:43.090334 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:43.090288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:58:43.090852 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:43.090448 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:58:43.090852 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:43.090470 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found
Apr 16 13:58:43.090852 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:43.090538 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.090521615 +0000 UTC m=+161.268770461 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found
Apr 16 13:58:43.191158 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:43.191113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:58:43.191158 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:43.191161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:58:43.191339 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:43.191255 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:58:43.191339 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:43.191260 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:58:43.191339 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:43.191312 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.191298107 +0000 UTC m=+161.369546958 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found
Apr 16 13:58:43.191339 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:58:43.191327 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.191320586 +0000 UTC m=+161.369569432 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found
Apr 16 13:58:47.610175 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:58:47.610141 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-96bs9"
Apr 16 13:59:15.120180 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:15.120138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp"
Apr 16 13:59:15.120651 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:15.120279 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:59:15.120651 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:15.120343 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs podName:77d59171-3e29-4e55-a4d9-a076a67a50ce nodeName:}" failed. No retries permitted until 2026-04-16 14:01:17.120325552 +0000 UTC m=+251.298574398 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs") pod "network-metrics-daemon-crlsp" (UID: "77d59171-3e29-4e55-a4d9-a076a67a50ce") : secret "metrics-daemon-secret" not found
Apr 16 13:59:42.252410 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:42.252360 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf" podUID="f6503505-8fc2-4e22-b559-ade573fe4d03"
Apr 16 13:59:42.264552 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:42.264517 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-g5lb6" podUID="92799234-6fff-45be-a27c-c70096483d30"
Apr 16 13:59:42.273097 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:42.273071 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hgrw6" podUID="b77dfd1a-6bd6-449b-8db3-c93ff41eb18e"
Apr 16 13:59:42.769619 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.769586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 13:59:42.769799 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.769657 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g5lb6"
Apr 16 13:59:42.769799 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.769676 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 13:59:42.881766 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.881719 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf"]
Apr 16 13:59:42.884438 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.884424 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf" Apr 16 13:59:42.888086 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.888063 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mwdgw\"" Apr 16 13:59:42.888086 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.888079 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:42.888694 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.888666 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:42.890405 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.890386 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5zl6t"] Apr 16 13:59:42.893079 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.893065 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zxb7l"] Apr 16 13:59:42.893206 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.893192 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:42.895809 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.895794 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:42.898997 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.898976 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 13:59:42.899095 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.898997 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 13:59:42.899095 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.899064 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 13:59:42.899199 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.899113 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-jmxx7\"" Apr 16 13:59:42.899199 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.899173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 13:59:42.899837 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.899820 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:42.904337 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.904319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 13:59:42.904475 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.904457 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:42.904534 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.904483 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 13:59:42.905041 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.905021 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-6bbxs\"" Apr 16 13:59:42.905432 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.905410 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf"] Apr 16 13:59:42.911597 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.911566 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 13:59:42.913313 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.913288 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 13:59:42.916416 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5t49\" (UniqueName: \"kubernetes.io/projected/18e2e1da-09fc-4969-99a4-1d53b1a12d83-kube-api-access-g5t49\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:42.916530 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xht2\" (UniqueName: \"kubernetes.io/projected/24046d5b-b6df-4005-85d5-01cafc82cc40-kube-api-access-6xht2\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:42.916530 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7qjd\" (UniqueName: \"kubernetes.io/projected/331d1069-9cb3-438e-a5bd-46015afcf351-kube-api-access-x7qjd\") pod \"volume-data-source-validator-7d955d5dd4-rqmhf\" (UID: \"331d1069-9cb3-438e-a5bd-46015afcf351\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf" Apr 16 13:59:42.916530 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18e2e1da-09fc-4969-99a4-1d53b1a12d83-tmp\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:42.916662 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24046d5b-b6df-4005-85d5-01cafc82cc40-config\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:42.916662 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24046d5b-b6df-4005-85d5-01cafc82cc40-serving-cert\") pod 
\"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:42.916662 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/18e2e1da-09fc-4969-99a4-1d53b1a12d83-snapshots\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:42.916662 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18e2e1da-09fc-4969-99a4-1d53b1a12d83-serving-cert\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:42.916871 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e2e1da-09fc-4969-99a4-1d53b1a12d83-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:42.916871 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24046d5b-b6df-4005-85d5-01cafc82cc40-trusted-ca\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:42.916871 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.916802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e2e1da-09fc-4969-99a4-1d53b1a12d83-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:42.922239 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.922218 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5zl6t"] Apr 16 13:59:42.922834 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.922816 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zxb7l"] Apr 16 13:59:42.991884 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.991844 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk"] Apr 16 13:59:42.994643 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.994626 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw"] Apr 16 13:59:42.994849 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.994826 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:42.997409 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.997187 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p"] Apr 16 13:59:42.997589 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.997572 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:42.999048 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.999027 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:42.999256 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.999239 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-b7rpx\"" Apr 16 13:59:42.999538 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.999522 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 13:59:42.999538 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.999530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:42.999880 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:42.999867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p" Apr 16 13:59:43.002897 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.002881 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-t5jxs\"" Apr 16 13:59:43.004114 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.004096 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:43.004190 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.004152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-mfz9z\"" Apr 16 13:59:43.004190 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.004171 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 13:59:43.004887 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.004870 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:43.005115 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.005095 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 13:59:43.015441 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.015413 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk"] Apr 16 13:59:43.018127 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018102 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c123a6-dc70-4989-aff2-c7374863a689-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.018230 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xht2\" (UniqueName: \"kubernetes.io/projected/24046d5b-b6df-4005-85d5-01cafc82cc40-kube-api-access-6xht2\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.018230 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls66l\" (UniqueName: \"kubernetes.io/projected/6597c240-5448-4187-9f6c-4e1e6d7b7aa0-kube-api-access-ls66l\") pod \"network-check-source-7b678d77c7-94g8p\" (UID: \"6597c240-5448-4187-9f6c-4e1e6d7b7aa0\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p" Apr 16 13:59:43.018296 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24046d5b-b6df-4005-85d5-01cafc82cc40-serving-cert\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.018296 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/18e2e1da-09fc-4969-99a4-1d53b1a12d83-snapshots\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.018381 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18e2e1da-09fc-4969-99a4-1d53b1a12d83-serving-cert\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.018381 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e2e1da-09fc-4969-99a4-1d53b1a12d83-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.018477 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24046d5b-b6df-4005-85d5-01cafc82cc40-trusted-ca\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.018542 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018522 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:43.018660 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018573 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8c123a6-dc70-4989-aff2-c7374863a689-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.018660 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018624 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5t49\" (UniqueName: \"kubernetes.io/projected/18e2e1da-09fc-4969-99a4-1d53b1a12d83-kube-api-access-g5t49\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.018799 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7qjd\" (UniqueName: \"kubernetes.io/projected/331d1069-9cb3-438e-a5bd-46015afcf351-kube-api-access-x7qjd\") pod \"volume-data-source-validator-7d955d5dd4-rqmhf\" (UID: \"331d1069-9cb3-438e-a5bd-46015afcf351\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf" Apr 16 13:59:43.018799 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18e2e1da-09fc-4969-99a4-1d53b1a12d83-tmp\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.018799 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7qk\" (UniqueName: \"kubernetes.io/projected/710be765-3a16-444a-b531-1c251066d20c-kube-api-access-gn7qk\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:43.018942 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24046d5b-b6df-4005-85d5-01cafc82cc40-config\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.018993 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e2e1da-09fc-4969-99a4-1d53b1a12d83-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.018993 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.018982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/18e2e1da-09fc-4969-99a4-1d53b1a12d83-snapshots\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.019089 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.019030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwjf6\" (UniqueName: \"kubernetes.io/projected/e8c123a6-dc70-4989-aff2-c7374863a689-kube-api-access-wwjf6\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.019147 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.019126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e2e1da-09fc-4969-99a4-1d53b1a12d83-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.019261 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.019240 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/18e2e1da-09fc-4969-99a4-1d53b1a12d83-tmp\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.019543 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.019523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24046d5b-b6df-4005-85d5-01cafc82cc40-trusted-ca\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.019648 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.019609 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24046d5b-b6df-4005-85d5-01cafc82cc40-config\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.020300 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.020279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e2e1da-09fc-4969-99a4-1d53b1a12d83-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.021655 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.021517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18e2e1da-09fc-4969-99a4-1d53b1a12d83-serving-cert\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.021884 ip-10-0-141-131 
kubenswrapper[2575]: I0416 13:59:43.021865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24046d5b-b6df-4005-85d5-01cafc82cc40-serving-cert\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.023306 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.023284 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p"] Apr 16 13:59:43.024467 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.024446 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw"] Apr 16 13:59:43.036864 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.036836 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xht2\" (UniqueName: \"kubernetes.io/projected/24046d5b-b6df-4005-85d5-01cafc82cc40-kube-api-access-6xht2\") pod \"console-operator-d87b8d5fc-zxb7l\" (UID: \"24046d5b-b6df-4005-85d5-01cafc82cc40\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.051824 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.051795 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7qjd\" (UniqueName: \"kubernetes.io/projected/331d1069-9cb3-438e-a5bd-46015afcf351-kube-api-access-x7qjd\") pod \"volume-data-source-validator-7d955d5dd4-rqmhf\" (UID: \"331d1069-9cb3-438e-a5bd-46015afcf351\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf" Apr 16 13:59:43.055746 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.055702 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5t49\" (UniqueName: \"kubernetes.io/projected/18e2e1da-09fc-4969-99a4-1d53b1a12d83-kube-api-access-g5t49\") pod \"insights-operator-5785d4fcdd-5zl6t\" (UID: \"18e2e1da-09fc-4969-99a4-1d53b1a12d83\") " pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.119989 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.119949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwjf6\" (UniqueName: \"kubernetes.io/projected/e8c123a6-dc70-4989-aff2-c7374863a689-kube-api-access-wwjf6\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.120190 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.120012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c123a6-dc70-4989-aff2-c7374863a689-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.120190 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.120049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ls66l\" (UniqueName: \"kubernetes.io/projected/6597c240-5448-4187-9f6c-4e1e6d7b7aa0-kube-api-access-ls66l\") pod \"network-check-source-7b678d77c7-94g8p\" (UID: 
\"6597c240-5448-4187-9f6c-4e1e6d7b7aa0\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p" Apr 16 13:59:43.120190 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.120095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:43.120190 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.120120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8c123a6-dc70-4989-aff2-c7374863a689-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.120190 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.120157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7qk\" (UniqueName: \"kubernetes.io/projected/710be765-3a16-444a-b531-1c251066d20c-kube-api-access-gn7qk\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:43.120443 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:43.120202 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:59:43.120443 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:43.120280 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls podName:710be765-3a16-444a-b531-1c251066d20c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:43.620259396 +0000 UTC m=+157.798508251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls") pod "cluster-samples-operator-667775844f-98npk" (UID: "710be765-3a16-444a-b531-1c251066d20c") : secret "samples-operator-tls" not found Apr 16 13:59:43.120699 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.120669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c123a6-dc70-4989-aff2-c7374863a689-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.122392 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.122365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8c123a6-dc70-4989-aff2-c7374863a689-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.131615 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.131584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7qk\" (UniqueName: \"kubernetes.io/projected/710be765-3a16-444a-b531-1c251066d20c-kube-api-access-gn7qk\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:43.131714 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.131692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwjf6\" (UniqueName: \"kubernetes.io/projected/e8c123a6-dc70-4989-aff2-c7374863a689-kube-api-access-wwjf6\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qsrlw\" (UID: \"e8c123a6-dc70-4989-aff2-c7374863a689\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.135047 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.135029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls66l\" (UniqueName: \"kubernetes.io/projected/6597c240-5448-4187-9f6c-4e1e6d7b7aa0-kube-api-access-ls66l\") pod \"network-check-source-7b678d77c7-94g8p\" (UID: \"6597c240-5448-4187-9f6c-4e1e6d7b7aa0\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p" Apr 16 13:59:43.193076 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.193043 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf" Apr 16 13:59:43.201918 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.201895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" Apr 16 13:59:43.208609 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.208581 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:43.311227 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.311190 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" Apr 16 13:59:43.316199 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.316170 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p" Apr 16 13:59:43.330157 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.330123 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf"] Apr 16 13:59:43.333893 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:59:43.333855 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331d1069_9cb3_438e_a5bd_46015afcf351.slice/crio-b98522f47d5a4c8455d130144378fdc80d0f64ce2a54efd7d1bb8bb707f95c7d WatchSource:0}: Error finding container b98522f47d5a4c8455d130144378fdc80d0f64ce2a54efd7d1bb8bb707f95c7d: Status 404 returned error can't find the container with id b98522f47d5a4c8455d130144378fdc80d0f64ce2a54efd7d1bb8bb707f95c7d Apr 16 13:59:43.407200 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:43.407160 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-crlsp" podUID="77d59171-3e29-4e55-a4d9-a076a67a50ce" Apr 16 13:59:43.440196 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.440156 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw"] Apr 16 13:59:43.443155 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:59:43.443125 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c123a6_dc70_4989_aff2_c7374863a689.slice/crio-275f94ff4909fe0590d86e4295a25cfff9fc8dab494b3010d7a155df5cc2d285 WatchSource:0}: Error finding container 275f94ff4909fe0590d86e4295a25cfff9fc8dab494b3010d7a155df5cc2d285: Status 404 returned error can't find the container with id 275f94ff4909fe0590d86e4295a25cfff9fc8dab494b3010d7a155df5cc2d285 Apr 16 13:59:43.452308 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.452284 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p"] Apr 16 13:59:43.455132 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:59:43.455109 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6597c240_5448_4187_9f6c_4e1e6d7b7aa0.slice/crio-467bc6b3312b6f0ed3d13931a2053370dba93e0aa624685f9e977df1a6a08b23 WatchSource:0}: Error finding container 467bc6b3312b6f0ed3d13931a2053370dba93e0aa624685f9e977df1a6a08b23: Status 404 returned error can't find the container with id 467bc6b3312b6f0ed3d13931a2053370dba93e0aa624685f9e977df1a6a08b23 Apr 16 13:59:43.560266 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.558358 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zxb7l"] Apr 16 13:59:43.562150 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.562064 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-5zl6t"] Apr 16 13:59:43.564802 ip-10-0-141-131 kubenswrapper[2575]: W0416 
13:59:43.564776 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24046d5b_b6df_4005_85d5_01cafc82cc40.slice/crio-2139a78c78061e1e024cc51adf475eb6096d92a55970b027e497153fa10dd345 WatchSource:0}: Error finding container 2139a78c78061e1e024cc51adf475eb6096d92a55970b027e497153fa10dd345: Status 404 returned error can't find the container with id 2139a78c78061e1e024cc51adf475eb6096d92a55970b027e497153fa10dd345 Apr 16 13:59:43.565437 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:59:43.565418 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18e2e1da_09fc_4969_99a4_1d53b1a12d83.slice/crio-0fdaaca5d1d04a81036cfd9d7db1fb93a4e596894748a266d57bfb179c98b268 WatchSource:0}: Error finding container 0fdaaca5d1d04a81036cfd9d7db1fb93a4e596894748a266d57bfb179c98b268: Status 404 returned error can't find the container with id 0fdaaca5d1d04a81036cfd9d7db1fb93a4e596894748a266d57bfb179c98b268 Apr 16 13:59:43.623957 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.623917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:43.624129 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:43.624064 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:59:43.624169 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:43.624158 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls podName:710be765-3a16-444a-b531-1c251066d20c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:44.624139468 +0000 UTC m=+158.802388319 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls") pod "cluster-samples-operator-667775844f-98npk" (UID: "710be765-3a16-444a-b531-1c251066d20c") : secret "samples-operator-tls" not found Apr 16 13:59:43.772924 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.772887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" event={"ID":"e8c123a6-dc70-4989-aff2-c7374863a689","Type":"ContainerStarted","Data":"275f94ff4909fe0590d86e4295a25cfff9fc8dab494b3010d7a155df5cc2d285"} Apr 16 13:59:43.773932 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.773905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf" event={"ID":"331d1069-9cb3-438e-a5bd-46015afcf351","Type":"ContainerStarted","Data":"b98522f47d5a4c8455d130144378fdc80d0f64ce2a54efd7d1bb8bb707f95c7d"} Apr 16 13:59:43.775337 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.775314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p" event={"ID":"6597c240-5448-4187-9f6c-4e1e6d7b7aa0","Type":"ContainerStarted","Data":"3ae5c66e8680d3ddd4956cc1b238952ebe848ebab42d7d4e3817c3874cb81276"} Apr 16 13:59:43.775441 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.775343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p" event={"ID":"6597c240-5448-4187-9f6c-4e1e6d7b7aa0","Type":"ContainerStarted","Data":"467bc6b3312b6f0ed3d13931a2053370dba93e0aa624685f9e977df1a6a08b23"} Apr 16 13:59:43.776312 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.776288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" event={"ID":"18e2e1da-09fc-4969-99a4-1d53b1a12d83","Type":"ContainerStarted","Data":"0fdaaca5d1d04a81036cfd9d7db1fb93a4e596894748a266d57bfb179c98b268"} Apr 16 13:59:43.777197 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.777180 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" event={"ID":"24046d5b-b6df-4005-85d5-01cafc82cc40","Type":"ContainerStarted","Data":"2139a78c78061e1e024cc51adf475eb6096d92a55970b027e497153fa10dd345"} Apr 16 13:59:43.796258 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:43.796205 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-94g8p" podStartSLOduration=1.796186796 podStartE2EDuration="1.796186796s" podCreationTimestamp="2026-04-16 13:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:43.7956509 +0000 UTC m=+157.973899772" watchObservedRunningTime="2026-04-16 13:59:43.796186796 +0000 UTC m=+157.974435667" Apr 16 13:59:44.634246 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:44.634205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:44.634702 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:44.634390 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:59:44.634702 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:44.634453 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls podName:710be765-3a16-444a-b531-1c251066d20c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:46.634435272 +0000 UTC m=+160.812684124 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls") pod "cluster-samples-operator-667775844f-98npk" (UID: "710be765-3a16-444a-b531-1c251066d20c") : secret "samples-operator-tls" not found Apr 16 13:59:45.668147 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.668111 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s"] Apr 16 13:59:45.672602 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.672577 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:45.674971 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.674944 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 13:59:45.675102 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.675033 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5w75q\"" Apr 16 13:59:45.675102 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.675042 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 13:59:45.681713 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.681683 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s"] Apr 16 13:59:45.745396 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.745354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:45.745588 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.745463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e84621c2-6f3b-487c-8a13-426a6d91539c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:45.784104 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.784065 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf" 
event={"ID":"331d1069-9cb3-438e-a5bd-46015afcf351","Type":"ContainerStarted","Data":"af7dbf32d25570c85c867ea89b3c2e86429d17dd13b1627f753118759181fe7d"} Apr 16 13:59:45.800093 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.800040 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rqmhf" podStartSLOduration=2.097281901 podStartE2EDuration="3.800020799s" podCreationTimestamp="2026-04-16 13:59:42 +0000 UTC" firstStartedPulling="2026-04-16 13:59:43.336307163 +0000 UTC m=+157.514556009" lastFinishedPulling="2026-04-16 13:59:45.03904606 +0000 UTC m=+159.217294907" observedRunningTime="2026-04-16 13:59:45.799299796 +0000 UTC m=+159.977548667" watchObservedRunningTime="2026-04-16 13:59:45.800020799 +0000 UTC m=+159.978269668" Apr 16 13:59:45.846020 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.845976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e84621c2-6f3b-487c-8a13-426a6d91539c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:45.846201 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.846150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:45.846307 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:45.846283 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 13:59:45.846447 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:45.846358 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert podName:e84621c2-6f3b-487c-8a13-426a6d91539c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:46.34633783 +0000 UTC m=+160.524586680 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6mm5s" (UID: "e84621c2-6f3b-487c-8a13-426a6d91539c") : secret "networking-console-plugin-cert" not found Apr 16 13:59:45.846747 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:45.846712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e84621c2-6f3b-487c-8a13-426a6d91539c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:46.350077 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.350045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:46.350252 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:46.350209 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 13:59:46.350298 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:46.350280 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert podName:e84621c2-6f3b-487c-8a13-426a6d91539c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.350259987 +0000 UTC m=+161.528508848 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6mm5s" (UID: "e84621c2-6f3b-487c-8a13-426a6d91539c") : secret "networking-console-plugin-cert" not found Apr 16 13:59:46.652841 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.652814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:46.652973 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:46.652958 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:59:46.653024 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:46.653015 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls podName:710be765-3a16-444a-b531-1c251066d20c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:50.652998295 +0000 UTC m=+164.831247153 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls") pod "cluster-samples-operator-667775844f-98npk" (UID: "710be765-3a16-444a-b531-1c251066d20c") : secret "samples-operator-tls" not found Apr 16 13:59:46.787864 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.787833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" event={"ID":"18e2e1da-09fc-4969-99a4-1d53b1a12d83","Type":"ContainerStarted","Data":"9f43884104bf23b1d2ebaf5bf1f2429abd76b2baa1d7149a4e12d51c967ad5b7"} Apr 16 13:59:46.789330 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.789301 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/0.log" Apr 16 13:59:46.789423 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.789354 2575 generic.go:358] "Generic (PLEG): container finished" podID="24046d5b-b6df-4005-85d5-01cafc82cc40" containerID="b56d8dc02506898a418ee2367ac759399ee47e62cae1d98456a2b1796fa61a4d" exitCode=255 Apr 16 13:59:46.789472 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.789435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" event={"ID":"24046d5b-b6df-4005-85d5-01cafc82cc40","Type":"ContainerDied","Data":"b56d8dc02506898a418ee2367ac759399ee47e62cae1d98456a2b1796fa61a4d"} Apr 16 13:59:46.789633 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.789611 2575 scope.go:117] "RemoveContainer" containerID="b56d8dc02506898a418ee2367ac759399ee47e62cae1d98456a2b1796fa61a4d" Apr 16 13:59:46.790836 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.790812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" event={"ID":"e8c123a6-dc70-4989-aff2-c7374863a689","Type":"ContainerStarted","Data":"3a8385718364b68591b2ddebde45ad0fe00c9e496a774f7dc81b33807c796bad"} Apr 16 13:59:46.803382 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.803310 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" podStartSLOduration=1.728192854 podStartE2EDuration="4.803297463s" podCreationTimestamp="2026-04-16 13:59:42 +0000 UTC" firstStartedPulling="2026-04-16 13:59:43.567503558 +0000 UTC m=+157.745752416" lastFinishedPulling="2026-04-16 13:59:46.642608165 +0000 UTC m=+160.820857025" observedRunningTime="2026-04-16 13:59:46.803230984 +0000 UTC m=+160.981479853" watchObservedRunningTime="2026-04-16 13:59:46.803297463 +0000 UTC m=+160.981546335" Apr 16 13:59:46.818493 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:46.818422 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" podStartSLOduration=1.6262955 podStartE2EDuration="4.81840675s" podCreationTimestamp="2026-04-16 13:59:42 +0000 UTC" firstStartedPulling="2026-04-16 13:59:43.444934843 +0000 UTC m=+157.623183689" lastFinishedPulling="2026-04-16 13:59:46.637046088 +0000 UTC m=+160.815294939" observedRunningTime="2026-04-16 13:59:46.817751651 +0000 UTC m=+160.996000520" watchObservedRunningTime="2026-04-16 13:59:46.81840675 +0000 UTC m=+160.996655618" Apr 16 13:59:47.157452 ip-10-0-141-131 
kubenswrapper[2575]: I0416 13:59:47.157412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") pod \"image-registry-6b9d68db7f-fjmrf\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") " pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf" Apr 16 13:59:47.157652 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.157559 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:47.157652 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.157582 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b9d68db7f-fjmrf: secret "image-registry-tls" not found Apr 16 13:59:47.157652 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.157642 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls podName:f6503505-8fc2-4e22-b559-ade573fe4d03 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:49.157626776 +0000 UTC m=+283.335875621 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls") pod "image-registry-6b9d68db7f-fjmrf" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03") : secret "image-registry-tls" not found Apr 16 13:59:47.258709 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.258617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6" Apr 16 13:59:47.258870 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.258748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6" Apr 16 13:59:47.258870 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.258772 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:47.258870 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.258836 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls podName:92799234-6fff-45be-a27c-c70096483d30 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:49.258817383 +0000 UTC m=+283.437066229 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls") pod "dns-default-g5lb6" (UID: "92799234-6fff-45be-a27c-c70096483d30") : secret "dns-default-metrics-tls" not found Apr 16 13:59:47.258870 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.258867 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:47.259009 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.258928 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert podName:b77dfd1a-6bd6-449b-8db3-c93ff41eb18e nodeName:}" failed. No retries permitted until 2026-04-16 14:01:49.258897629 +0000 UTC m=+283.437146495 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert") pod "ingress-canary-hgrw6" (UID: "b77dfd1a-6bd6-449b-8db3-c93ff41eb18e") : secret "canary-serving-cert" not found Apr 16 13:59:47.359810 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.359774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:47.359976 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.359913 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 13:59:47.359976 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.359968 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert podName:e84621c2-6f3b-487c-8a13-426a6d91539c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:49.359954338 +0000 UTC m=+163.538203189 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6mm5s" (UID: "e84621c2-6f3b-487c-8a13-426a6d91539c") : secret "networking-console-plugin-cert" not found
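
The durationBeforeRetry values in these mount failures are kubelet's per-volume exponential backoff, applied in nestedpendingoperations.go: networking-console-plugin-cert has gone 1s then 2s (and reaches 4s and 8s further down), while registry-tls, metrics-tls and the canary cert, which have been failing for longer, are already pinned at the 2m2s ceiling. A minimal Go sketch of that doubling-with-cap behavior; the initial delay and the cap are read off this log, not quoted from kubelet source:

    package main

    import (
        "fmt"
        "time"
    )

    // Doubling retry delay with a ceiling, matching the durationBeforeRetry
    // values logged above. Constants are assumptions inferred from this log
    // (the ceiling shows up as "durationBeforeRetry 2m2s"), not kubelet's own.
    const (
        initialDelay = 500 * time.Millisecond
        maxDelay     = 2*time.Minute + 2*time.Second
    )

    func nextDelay(last time.Duration) time.Duration {
        if last == 0 {
            return initialDelay
        }
        if d := 2 * last; d < maxDelay {
            return d
        }
        return maxDelay
    }

    func main() {
        var d time.Duration
        for i := 0; i < 10; i++ {
            d = nextDelay(d)
            fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
        }
    }

Each failing volume backs off independently, which is why different volumes sit at different delays at the same timestamp.
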
Apr 16 13:59:47.798361 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.798332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/1.log" Apr 16 13:59:47.798822 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.798781 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/0.log" Apr 16 13:59:47.798876 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.798822 2575 generic.go:358] "Generic (PLEG): container finished" podID="24046d5b-b6df-4005-85d5-01cafc82cc40" containerID="bc55c78044e5378200810912284afe5e4bd76820ca40f6d12d19cc3130a9c227" exitCode=255 Apr 16 13:59:47.798944 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.798918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" event={"ID":"24046d5b-b6df-4005-85d5-01cafc82cc40","Type":"ContainerDied","Data":"bc55c78044e5378200810912284afe5e4bd76820ca40f6d12d19cc3130a9c227"} Apr 16 13:59:47.798998 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.798983 2575 scope.go:117] "RemoveContainer" containerID="b56d8dc02506898a418ee2367ac759399ee47e62cae1d98456a2b1796fa61a4d" Apr 16 13:59:47.799231 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:47.799157 2575 scope.go:117] "RemoveContainer" containerID="bc55c78044e5378200810912284afe5e4bd76820ca40f6d12d19cc3130a9c227" Apr 16 13:59:47.799394 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:47.799374 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zxb7l_openshift-console-operator(24046d5b-b6df-4005-85d5-01cafc82cc40)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" podUID="24046d5b-b6df-4005-85d5-01cafc82cc40" Apr 16 13:59:48.330626 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.330593 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x"] Apr 16 13:59:48.336172 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.336148 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" Apr 16 13:59:48.338660 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.338635 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:48.338660 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.338653 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 13:59:48.339320 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.339307 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2bv5f\"" Apr 16 13:59:48.344556 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.344520 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x"] Apr 16 13:59:48.369804 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.369768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nspqv\" (UniqueName: \"kubernetes.io/projected/11464619-ce66-4292-8d50-8b67501941bf-kube-api-access-nspqv\") pod \"migrator-64d4d94569-8pp5x\" (UID: \"11464619-ce66-4292-8d50-8b67501941bf\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" Apr 16 13:59:48.470658 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.470618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nspqv\" (UniqueName: \"kubernetes.io/projected/11464619-ce66-4292-8d50-8b67501941bf-kube-api-access-nspqv\") pod \"migrator-64d4d94569-8pp5x\" (UID: \"11464619-ce66-4292-8d50-8b67501941bf\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" Apr 16 13:59:48.478425 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.478393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nspqv\" (UniqueName: \"kubernetes.io/projected/11464619-ce66-4292-8d50-8b67501941bf-kube-api-access-nspqv\") pod \"migrator-64d4d94569-8pp5x\" (UID: \"11464619-ce66-4292-8d50-8b67501941bf\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" Apr 16 13:59:48.645552 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.645525 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" Apr 16 13:59:48.672552 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.672518 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-nnvgs"] Apr 16 13:59:48.676693 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.676665 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.678976 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.678955 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 13:59:48.679209 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.679189 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 13:59:48.679331 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.679315 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 13:59:48.679411 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.679324 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 13:59:48.679473 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.679442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-dj6pq\"" Apr 16 13:59:48.683361 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.683322 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-nnvgs"] Apr 16 13:59:48.766519 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.766473 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x"] Apr 16 13:59:48.769907 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:59:48.769873 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11464619_ce66_4292_8d50_8b67501941bf.slice/crio-2334633eec1312f160ecfe22d9236757c8e5f7fd3bb1fade9e92d38fff5a1d9c WatchSource:0}: Error finding container 2334633eec1312f160ecfe22d9236757c8e5f7fd3bb1fade9e92d38fff5a1d9c: Status 404 returned error can't find the container with id 2334633eec1312f160ecfe22d9236757c8e5f7fd3bb1fade9e92d38fff5a1d9c Apr 16 13:59:48.773491 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.773464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2krp\" (UniqueName: \"kubernetes.io/projected/63e272eb-cb21-46fd-a56c-015606a91604-kube-api-access-z2krp\") pod \"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.773601 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.773525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/63e272eb-cb21-46fd-a56c-015606a91604-signing-key\") pod \"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.773660 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.773639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/63e272eb-cb21-46fd-a56c-015606a91604-signing-cabundle\") pod \"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.802525 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.802492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" event={"ID":"11464619-ce66-4292-8d50-8b67501941bf","Type":"ContainerStarted","Data":"2334633eec1312f160ecfe22d9236757c8e5f7fd3bb1fade9e92d38fff5a1d9c"} Apr 16 13:59:48.803704 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.803679 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/1.log" Apr 16 13:59:48.804050 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.804035 2575 scope.go:117] "RemoveContainer" containerID="bc55c78044e5378200810912284afe5e4bd76820ca40f6d12d19cc3130a9c227" Apr 16 13:59:48.804212 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:48.804194 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zxb7l_openshift-console-operator(24046d5b-b6df-4005-85d5-01cafc82cc40)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" podUID="24046d5b-b6df-4005-85d5-01cafc82cc40" Apr 16 13:59:48.874970 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.874935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/63e272eb-cb21-46fd-a56c-015606a91604-signing-cabundle\") pod \"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.875153 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.875016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2krp\" (UniqueName: \"kubernetes.io/projected/63e272eb-cb21-46fd-a56c-015606a91604-kube-api-access-z2krp\") pod \"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.875212 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.875189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/63e272eb-cb21-46fd-a56c-015606a91604-signing-key\") pod \"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.875826 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.875806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/63e272eb-cb21-46fd-a56c-015606a91604-signing-cabundle\") pod \"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.877542 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.877515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/63e272eb-cb21-46fd-a56c-015606a91604-signing-key\") pod \"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.883687 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.883665 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2krp\" (UniqueName: \"kubernetes.io/projected/63e272eb-cb21-46fd-a56c-015606a91604-kube-api-access-z2krp\") pod 
\"service-ca-bfc587fb7-nnvgs\" (UID: \"63e272eb-cb21-46fd-a56c-015606a91604\") " pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:48.987261 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:48.987176 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" Apr 16 13:59:49.104135 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:49.104098 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-nnvgs"] Apr 16 13:59:49.108678 ip-10-0-141-131 kubenswrapper[2575]: W0416 13:59:49.108641 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e272eb_cb21_46fd_a56c_015606a91604.slice/crio-bc66e989803649218365f68d6197d07f524b80d323ba50ae5df53f6850d9f204 WatchSource:0}: Error finding container bc66e989803649218365f68d6197d07f524b80d323ba50ae5df53f6850d9f204: Status 404 returned error can't find the container with id bc66e989803649218365f68d6197d07f524b80d323ba50ae5df53f6850d9f204 Apr 16 13:59:49.378573 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:49.378533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:49.378804 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:49.378781 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 13:59:49.378879 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:49.378870 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert podName:e84621c2-6f3b-487c-8a13-426a6d91539c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.37884685 +0000 UTC m=+167.557095709 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6mm5s" (UID: "e84621c2-6f3b-487c-8a13-426a6d91539c") : secret "networking-console-plugin-cert" not found Apr 16 13:59:49.807484 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:49.807387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" event={"ID":"63e272eb-cb21-46fd-a56c-015606a91604","Type":"ContainerStarted","Data":"bc66e989803649218365f68d6197d07f524b80d323ba50ae5df53f6850d9f204"} Apr 16 13:59:50.283862 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:50.283836 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hvqjp_1b680da6-ab85-4c31-98d8-35be4b07624b/dns-node-resolver/0.log" Apr 16 13:59:50.691374 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:50.691331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:50.691556 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:50.691513 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:59:50.691613 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:50.691595 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls podName:710be765-3a16-444a-b531-1c251066d20c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:58.691572566 +0000 UTC m=+172.869821422 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls") pod "cluster-samples-operator-667775844f-98npk" (UID: "710be765-3a16-444a-b531-1c251066d20c") : secret "samples-operator-tls" not found
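
All of these failures have the same root cause: the pod references a secret that has not been created yet (the operator that publishes it is still starting). To confirm that from outside the kubelet, a small client-go check is enough; the namespace and secret name below are taken from the samples-operator-tls entries above, and the kubeconfig path is a placeholder assumption:

    package main

    import (
        "context"
        "fmt"
        "log"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumed kubeconfig location; substitute credentials that reach this cluster.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        // Namespace and secret name from the MountVolume.SetUp errors above.
        _, err = cs.CoreV1().Secrets("openshift-cluster-samples-operator").
            Get(context.TODO(), "samples-operator-tls", metav1.GetOptions{})
        switch {
        case apierrors.IsNotFound(err):
            fmt.Println("secret not created yet; kubelet will keep retrying the mount")
        case err != nil:
            log.Fatal(err)
        default:
            fmt.Println("secret exists; the next mount retry should succeed")
        }
    }

Once the secret appears, the next retry succeeds, as the 13:59:58 samples-operator-tls entries below show.
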
Apr 16 13:59:50.812112 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:50.812079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" event={"ID":"11464619-ce66-4292-8d50-8b67501941bf","Type":"ContainerStarted","Data":"d7d0ba65df7800e74dca79f94a015f35f7aef65c7e555b1799546ac7028fc9a9"} Apr 16 13:59:50.812112 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:50.812114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" event={"ID":"11464619-ce66-4292-8d50-8b67501941bf","Type":"ContainerStarted","Data":"b75c852363eb1fded3be05d46986f58beae4d3a9a5932c96254fe495d9e18a0d"} Apr 16 13:59:50.830913 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:50.830861 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8pp5x" podStartSLOduration=1.5044563279999998 podStartE2EDuration="2.830842682s" podCreationTimestamp="2026-04-16 13:59:48 +0000 UTC" firstStartedPulling="2026-04-16 13:59:48.77186642 +0000 UTC m=+162.950115267" lastFinishedPulling="2026-04-16 13:59:50.09825277 +0000 UTC m=+164.276501621" observedRunningTime="2026-04-16 13:59:50.829029522 +0000 UTC m=+165.007278391" watchObservedRunningTime="2026-04-16 13:59:50.830842682 +0000 UTC m=+165.009091550" Apr 16 13:59:51.081863 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:51.081787 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-89jl9_cb697396-e88e-4780-9f6a-2109bfc21e0f/node-ca/0.log" Apr 16 13:59:51.816522 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:51.816481 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" event={"ID":"63e272eb-cb21-46fd-a56c-015606a91604","Type":"ContainerStarted","Data":"a9c65015c210faf0c2a746dc06d0c0202c5a3be905501438e09561395124ec4d"} Apr 16 13:59:51.831399 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:51.831331 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-nnvgs" podStartSLOduration=2.128785654 podStartE2EDuration="3.831311281s" podCreationTimestamp="2026-04-16 13:59:48 +0000 UTC" firstStartedPulling="2026-04-16 13:59:49.110451147 +0000 UTC m=+163.288699994" lastFinishedPulling="2026-04-16 13:59:50.812976774 +0000 UTC m=+164.991225621" observedRunningTime="2026-04-16 13:59:51.830885282 +0000 UTC m=+166.009134150" watchObservedRunningTime="2026-04-16 13:59:51.831311281 +0000 UTC m=+166.009560153" Apr 16 13:59:53.209810 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:53.209776 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:53.210272 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:53.209825 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 13:59:53.214665 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:53.210691 2575 scope.go:117] "RemoveContainer" containerID="bc55c78044e5378200810912284afe5e4bd76820ca40f6d12d19cc3130a9c227"
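
The two "Observed pod startup duration" entries above also show how the tracker relates its numbers: podStartSLOduration is podStartE2EDuration minus the time spent pulling images, which the monotonic offsets (m=+...) let you verify. A quick check against the migrator entry, with every number copied from that log line:

    package main

    import "fmt"

    func main() {
        // From the migrator-64d4d94569-8pp5x entry: podStartE2EDuration and the
        // m=+ offsets of firstStartedPulling / lastFinishedPulling, in seconds.
        const (
            e2e                 = 2.830842682
            firstStartedPulling = 162.950115267
            lastFinishedPulling = 164.276501621
        )
        pull := lastFinishedPulling - firstStartedPulling
        // Prints 1.504456328, matching podStartSLOduration=1.5044563279999998
        // (the tracker logs the raw float64).
        fmt.Printf("SLO duration: %.9f s\n", e2e-pull)
    }

The insights-operator entry further up checks out the same way: 4.803297463s end to end minus 3.075104609s of image pulling gives its podStartSLOduration of 1.728192854.
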
containerID="bc55c78044e5378200810912284afe5e4bd76820ca40f6d12d19cc3130a9c227" Apr 16 13:59:53.214665 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:53.211091 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zxb7l_openshift-console-operator(24046d5b-b6df-4005-85d5-01cafc82cc40)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" podUID="24046d5b-b6df-4005-85d5-01cafc82cc40" Apr 16 13:59:53.417512 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:53.417472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 13:59:53.417696 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:53.417605 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 13:59:53.417696 ip-10-0-141-131 kubenswrapper[2575]: E0416 13:59:53.417675 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert podName:e84621c2-6f3b-487c-8a13-426a6d91539c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:01.417659214 +0000 UTC m=+175.595908061 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-6mm5s" (UID: "e84621c2-6f3b-487c-8a13-426a6d91539c") : secret "networking-console-plugin-cert" not found Apr 16 13:59:56.387525 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:56.387439 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 13:59:58.763778 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:58.763714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:58.766228 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:58.766200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/710be765-3a16-444a-b531-1c251066d20c-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-98npk\" (UID: \"710be765-3a16-444a-b531-1c251066d20c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:58.904492 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:58.904445 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" Apr 16 13:59:59.028316 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:59.027430 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk"] Apr 16 13:59:59.841156 ip-10-0-141-131 kubenswrapper[2575]: I0416 13:59:59.841120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" event={"ID":"710be765-3a16-444a-b531-1c251066d20c","Type":"ContainerStarted","Data":"24fc9e8c0e498cbbd0ed10f6be91d3610c452551257dc3936aa83adf7d75fa4e"} Apr 16 14:00:00.844747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:00.844694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" event={"ID":"710be765-3a16-444a-b531-1c251066d20c","Type":"ContainerStarted","Data":"5a4c563fddfa0d1714243f95ba3af72f3c8994f2cf76aa179faaa68fe01a0e98"} Apr 16 14:00:01.489773 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:01.489714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 14:00:01.492326 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:01.492301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e84621c2-6f3b-487c-8a13-426a6d91539c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-6mm5s\" (UID: \"e84621c2-6f3b-487c-8a13-426a6d91539c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 14:00:01.583270 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:01.583229 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" Apr 16 14:00:01.702867 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:01.702832 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s"] Apr 16 14:00:01.706481 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:00:01.706435 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84621c2_6f3b_487c_8a13_426a6d91539c.slice/crio-180b0cd1a898e3206823249a2d5f6b112081bb2a42244f6eb47a500194a32bd3 WatchSource:0}: Error finding container 180b0cd1a898e3206823249a2d5f6b112081bb2a42244f6eb47a500194a32bd3: Status 404 returned error can't find the container with id 180b0cd1a898e3206823249a2d5f6b112081bb2a42244f6eb47a500194a32bd3 Apr 16 14:00:01.848126 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:01.848031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" event={"ID":"e84621c2-6f3b-487c-8a13-426a6d91539c","Type":"ContainerStarted","Data":"180b0cd1a898e3206823249a2d5f6b112081bb2a42244f6eb47a500194a32bd3"} Apr 16 14:00:01.849544 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:01.849520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" event={"ID":"710be765-3a16-444a-b531-1c251066d20c","Type":"ContainerStarted","Data":"b4cde7b4bf023da5098f1a0762828e1cff8b79a77bf4d32b5293ed537ffafbf7"} Apr 16 14:00:01.866973 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:01.866920 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-98npk" podStartSLOduration=18.163441292999998 podStartE2EDuration="19.866903835s" podCreationTimestamp="2026-04-16 13:59:42 +0000 UTC" firstStartedPulling="2026-04-16 13:59:59.077702534 +0000 UTC m=+173.255951379" lastFinishedPulling="2026-04-16 14:00:00.781165075 +0000 UTC m=+174.959413921" observedRunningTime="2026-04-16 14:00:01.866311449 +0000 UTC m=+176.044560330" watchObservedRunningTime="2026-04-16 14:00:01.866903835 +0000 UTC m=+176.045152703" Apr 16 14:00:04.857966 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:04.857934 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" event={"ID":"e84621c2-6f3b-487c-8a13-426a6d91539c","Type":"ContainerStarted","Data":"e5d0199cbc2a7bfdf92fd3969e9317e54786c70d5d0229b6f2a65288d4dc5ca3"} Apr 16 14:00:07.386457 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:07.386422 2575 scope.go:117] "RemoveContainer" containerID="bc55c78044e5378200810912284afe5e4bd76820ca40f6d12d19cc3130a9c227" Apr 16 14:00:07.866575 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:07.866546 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:00:07.866921 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:07.866906 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/1.log" Apr 16 14:00:07.866978 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:07.866938 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="24046d5b-b6df-4005-85d5-01cafc82cc40" containerID="53104a9ef0427c53337652dbcddd1d653880e70e91a6e09f88461acfaa6071c4" exitCode=255 Apr 16 14:00:07.867021 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:07.866982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" event={"ID":"24046d5b-b6df-4005-85d5-01cafc82cc40","Type":"ContainerDied","Data":"53104a9ef0427c53337652dbcddd1d653880e70e91a6e09f88461acfaa6071c4"} Apr 16 14:00:07.867021 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:07.867010 2575 scope.go:117] "RemoveContainer" containerID="bc55c78044e5378200810912284afe5e4bd76820ca40f6d12d19cc3130a9c227" Apr 16 14:00:07.867332 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:07.867316 2575 scope.go:117] "RemoveContainer" containerID="53104a9ef0427c53337652dbcddd1d653880e70e91a6e09f88461acfaa6071c4" Apr 16 14:00:07.867516 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:00:07.867499 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zxb7l_openshift-console-operator(24046d5b-b6df-4005-85d5-01cafc82cc40)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" podUID="24046d5b-b6df-4005-85d5-01cafc82cc40" Apr 16 14:00:07.885272 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:07.885231 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-6mm5s" podStartSLOduration=20.719014007 podStartE2EDuration="22.885218314s" podCreationTimestamp="2026-04-16 13:59:45 +0000 UTC" firstStartedPulling="2026-04-16 14:00:01.708300135 +0000 UTC m=+175.886548981" lastFinishedPulling="2026-04-16 14:00:03.874504442 +0000 UTC m=+178.052753288" observedRunningTime="2026-04-16 14:00:04.875856186 +0000 UTC m=+179.054105053" watchObservedRunningTime="2026-04-16 14:00:07.885218314 +0000 UTC m=+182.063467223" Apr 16 14:00:08.871170 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:08.871141 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:00:11.403082 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.403053 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kjxps"] Apr 16 14:00:11.464366 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.464327 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kjxps"] Apr 16 14:00:11.464530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.464460 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.467209 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.467187 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:00:11.467209 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.467203 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:00:11.468106 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.468088 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vbnth\"" Apr 16 14:00:11.571666 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.571625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39530f63-5235-4a51-ade7-9af0431be21f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.571666 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.571667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39530f63-5235-4a51-ade7-9af0431be21f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.571912 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.571719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htjph\" (UniqueName: \"kubernetes.io/projected/39530f63-5235-4a51-ade7-9af0431be21f-kube-api-access-htjph\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.571912 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.571825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39530f63-5235-4a51-ade7-9af0431be21f-data-volume\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.571912 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.571866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39530f63-5235-4a51-ade7-9af0431be21f-crio-socket\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.673091 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.672999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39530f63-5235-4a51-ade7-9af0431be21f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.673091 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.673038 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39530f63-5235-4a51-ade7-9af0431be21f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.673091 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.673064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htjph\" (UniqueName: \"kubernetes.io/projected/39530f63-5235-4a51-ade7-9af0431be21f-kube-api-access-htjph\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.673350 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.673143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39530f63-5235-4a51-ade7-9af0431be21f-data-volume\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.673350 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.673207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39530f63-5235-4a51-ade7-9af0431be21f-crio-socket\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.673350 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.673314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39530f63-5235-4a51-ade7-9af0431be21f-crio-socket\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.673571 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.673549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39530f63-5235-4a51-ade7-9af0431be21f-data-volume\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.673634 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.673553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39530f63-5235-4a51-ade7-9af0431be21f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.675491 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.675467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39530f63-5235-4a51-ade7-9af0431be21f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.683133 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.683101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htjph\" (UniqueName: 
\"kubernetes.io/projected/39530f63-5235-4a51-ade7-9af0431be21f-kube-api-access-htjph\") pod \"insights-runtime-extractor-kjxps\" (UID: \"39530f63-5235-4a51-ade7-9af0431be21f\") " pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.773135 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.773099 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kjxps" Apr 16 14:00:11.885848 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:11.885815 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kjxps"] Apr 16 14:00:11.888823 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:00:11.888796 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39530f63_5235_4a51_ade7_9af0431be21f.slice/crio-4bd17ea7046f6ab2d394dfa8c5c2e5ea7631c12400a67669396101460050ce66 WatchSource:0}: Error finding container 4bd17ea7046f6ab2d394dfa8c5c2e5ea7631c12400a67669396101460050ce66: Status 404 returned error can't find the container with id 4bd17ea7046f6ab2d394dfa8c5c2e5ea7631c12400a67669396101460050ce66 Apr 16 14:00:12.881986 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:12.881941 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kjxps" event={"ID":"39530f63-5235-4a51-ade7-9af0431be21f","Type":"ContainerStarted","Data":"5a040cdd7b98ec3ae4d587521568f0df12174db5bea2e95e014f63ce26d99f0f"} Apr 16 14:00:12.881986 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:12.881980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kjxps" event={"ID":"39530f63-5235-4a51-ade7-9af0431be21f","Type":"ContainerStarted","Data":"4bd17ea7046f6ab2d394dfa8c5c2e5ea7631c12400a67669396101460050ce66"} Apr 16 14:00:13.209643 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:13.209558 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 14:00:13.209643 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:13.209611 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 14:00:13.210055 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:13.210037 2575 scope.go:117] "RemoveContainer" containerID="53104a9ef0427c53337652dbcddd1d653880e70e91a6e09f88461acfaa6071c4" Apr 16 14:00:13.210249 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:00:13.210232 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zxb7l_openshift-console-operator(24046d5b-b6df-4005-85d5-01cafc82cc40)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" podUID="24046d5b-b6df-4005-85d5-01cafc82cc40" Apr 16 14:00:13.886241 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:13.886202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kjxps" event={"ID":"39530f63-5235-4a51-ade7-9af0431be21f","Type":"ContainerStarted","Data":"2cc6cc8b14b1961beae4aad134ab3bffd12754eae8c60ae457ae8e7b955b880f"} Apr 16 14:00:13.886613 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:13.886482 2575 scope.go:117] "RemoveContainer" 
containerID="53104a9ef0427c53337652dbcddd1d653880e70e91a6e09f88461acfaa6071c4" Apr 16 14:00:13.886665 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:00:13.886647 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zxb7l_openshift-console-operator(24046d5b-b6df-4005-85d5-01cafc82cc40)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" podUID="24046d5b-b6df-4005-85d5-01cafc82cc40" Apr 16 14:00:15.893612 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:15.893579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kjxps" event={"ID":"39530f63-5235-4a51-ade7-9af0431be21f","Type":"ContainerStarted","Data":"c4650032e534d74f03f4a9f8eca86855ba9b630e80d0a9343469817d54ae6361"} Apr 16 14:00:15.912592 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:15.912538 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kjxps" podStartSLOduration=1.371189252 podStartE2EDuration="4.912523914s" podCreationTimestamp="2026-04-16 14:00:11 +0000 UTC" firstStartedPulling="2026-04-16 14:00:12.028449049 +0000 UTC m=+186.206697908" lastFinishedPulling="2026-04-16 14:00:15.569783709 +0000 UTC m=+189.748032570" observedRunningTime="2026-04-16 14:00:15.911983119 +0000 UTC m=+190.090231987" watchObservedRunningTime="2026-04-16 14:00:15.912523914 +0000 UTC m=+190.090772782" Apr 16 14:00:20.633235 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.633195 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xhcvf"] Apr 16 14:00:20.636547 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.636529 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.639026 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.639004 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 14:00:20.639286 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.639266 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:00:20.639391 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.639266 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:00:20.639391 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.639273 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 14:00:20.639489 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.639273 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5hcxn\"" Apr 16 14:00:20.639489 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.639274 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:00:20.645757 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.645717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xhcvf"] Apr 16 14:00:20.748521 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.748484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.748521 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.748528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.748817 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.748627 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.748817 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.748776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc8jf\" (UniqueName: \"kubernetes.io/projected/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-kube-api-access-gc8jf\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.849487 ip-10-0-141-131 
kubenswrapper[2575]: I0416 14:00:20.849456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gc8jf\" (UniqueName: \"kubernetes.io/projected/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-kube-api-access-gc8jf\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.849628 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.849493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.849628 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.849516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.849628 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.849563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.850299 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.850277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.851989 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.851957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.852089 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.852060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.857852 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.857833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc8jf\" (UniqueName: \"kubernetes.io/projected/ea940c5e-d3a3-4fb9-8758-aaada6c8070d-kube-api-access-gc8jf\") pod \"prometheus-operator-78f957474d-xhcvf\" (UID: \"ea940c5e-d3a3-4fb9-8758-aaada6c8070d\") " 
pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:20.947530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:20.947446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" Apr 16 14:00:21.064169 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:21.064139 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xhcvf"] Apr 16 14:00:21.067581 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:00:21.067551 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea940c5e_d3a3_4fb9_8758_aaada6c8070d.slice/crio-acbee570402f644bfcf6bb6c0d5fa2cc1682ab8e809a423fe6b717edcac863b4 WatchSource:0}: Error finding container acbee570402f644bfcf6bb6c0d5fa2cc1682ab8e809a423fe6b717edcac863b4: Status 404 returned error can't find the container with id acbee570402f644bfcf6bb6c0d5fa2cc1682ab8e809a423fe6b717edcac863b4 Apr 16 14:00:21.908518 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:21.908483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" event={"ID":"ea940c5e-d3a3-4fb9-8758-aaada6c8070d","Type":"ContainerStarted","Data":"acbee570402f644bfcf6bb6c0d5fa2cc1682ab8e809a423fe6b717edcac863b4"} Apr 16 14:00:23.915475 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:23.915435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" event={"ID":"ea940c5e-d3a3-4fb9-8758-aaada6c8070d","Type":"ContainerStarted","Data":"8767f0b58dcbffbfb61dd8177e6bf2aee6db86044deae38ae2c3928a6b0ea63e"} Apr 16 14:00:23.915475 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:23.915474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" event={"ID":"ea940c5e-d3a3-4fb9-8758-aaada6c8070d","Type":"ContainerStarted","Data":"9277a1264b1aca25b98c6bd86b14d0efc22e4ec5bc251c8e94e08abdae2e6c02"} Apr 16 14:00:23.932402 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:23.932353 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-xhcvf" podStartSLOduration=1.638690081 podStartE2EDuration="3.932338835s" podCreationTimestamp="2026-04-16 14:00:20 +0000 UTC" firstStartedPulling="2026-04-16 14:00:21.069539556 +0000 UTC m=+195.247788407" lastFinishedPulling="2026-04-16 14:00:23.363188312 +0000 UTC m=+197.541437161" observedRunningTime="2026-04-16 14:00:23.931130802 +0000 UTC m=+198.109379671" watchObservedRunningTime="2026-04-16 14:00:23.932338835 +0000 UTC m=+198.110587704" Apr 16 14:00:24.390016 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:24.389985 2575 scope.go:117] "RemoveContainer" containerID="53104a9ef0427c53337652dbcddd1d653880e70e91a6e09f88461acfaa6071c4" Apr 16 14:00:24.390177 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:00:24.390154 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zxb7l_openshift-console-operator(24046d5b-b6df-4005-85d5-01cafc82cc40)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" podUID="24046d5b-b6df-4005-85d5-01cafc82cc40" Apr 16 14:00:26.006765 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.006716 2575 
Apr 16 14:00:26.006765 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.006716 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4dhrq"]
Apr 16 14:00:26.013758 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.013712 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.017183 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.016798 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2gbx5\""
Apr 16 14:00:26.017183 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.017113 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:00:26.017370 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.017314 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:00:26.017579 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.017437 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:00:26.194044 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68sp\" (UniqueName: \"kubernetes.io/projected/21d470a0-3408-42d5-a74e-d66c383570a4-kube-api-access-b68sp\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.194201 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-tls\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.194201 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194102 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.194201 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-root\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.194201 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-sys\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.194201 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-textfile\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.194201 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21d470a0-3408-42d5-a74e-d66c383570a4-metrics-client-ca\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.194428 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-wtmp\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.194428 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.194325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.294799 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-root\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.294799 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-sys\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.294799 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-textfile\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.294799 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-root\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.294799 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21d470a0-3408-42d5-a74e-d66c383570a4-metrics-client-ca\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-sys\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-wtmp\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b68sp\" (UniqueName: \"kubernetes.io/projected/21d470a0-3408-42d5-a74e-d66c383570a4-kube-api-access-b68sp\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.294996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-wtmp\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.295072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-tls\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.295097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.295135 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-textfile\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.295430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21d470a0-3408-42d5-a74e-d66c383570a4-metrics-client-ca\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.295530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.295439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.297359 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.297340 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-tls\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.297492 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.297470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21d470a0-3408-42d5-a74e-d66c383570a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.305219 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.305180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68sp\" (UniqueName: \"kubernetes.io/projected/21d470a0-3408-42d5-a74e-d66c383570a4-kube-api-access-b68sp\") pod \"node-exporter-4dhrq\" (UID: \"21d470a0-3408-42d5-a74e-d66c383570a4\") " pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.324032 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.323999 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4dhrq"
Apr 16 14:00:26.334928 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:00:26.334898 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d470a0_3408_42d5_a74e_d66c383570a4.slice/crio-92bacc3d04e633bbce1db59c54a5d353412fdb1ec0eededd39e7aaa35743932b WatchSource:0}: Error finding container 92bacc3d04e633bbce1db59c54a5d353412fdb1ec0eededd39e7aaa35743932b: Status 404 returned error can't find the container with id 92bacc3d04e633bbce1db59c54a5d353412fdb1ec0eededd39e7aaa35743932b
Apr 16 14:00:26.925055 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:26.925017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4dhrq" event={"ID":"21d470a0-3408-42d5-a74e-d66c383570a4","Type":"ContainerStarted","Data":"92bacc3d04e633bbce1db59c54a5d353412fdb1ec0eededd39e7aaa35743932b"}
Apr 16 14:00:30.748556 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:30.748521 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh"]
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" Apr 16 14:00:30.755111 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:30.755084 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-wlnkr\"" Apr 16 14:00:30.755111 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:30.755106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 14:00:30.758976 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:30.758952 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh"] Apr 16 14:00:30.824920 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:30.824864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a3cc9243-9cb5-4af0-8004-572a25c168d1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-pn2zh\" (UID: \"a3cc9243-9cb5-4af0-8004-572a25c168d1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" Apr 16 14:00:30.925532 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:30.925490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a3cc9243-9cb5-4af0-8004-572a25c168d1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-pn2zh\" (UID: \"a3cc9243-9cb5-4af0-8004-572a25c168d1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" Apr 16 14:00:30.925709 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:00:30.925658 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 14:00:30.925779 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:00:30.925759 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3cc9243-9cb5-4af0-8004-572a25c168d1-monitoring-plugin-cert podName:a3cc9243-9cb5-4af0-8004-572a25c168d1 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:31.425738917 +0000 UTC m=+205.603987769 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/a3cc9243-9cb5-4af0-8004-572a25c168d1-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-pn2zh" (UID: "a3cc9243-9cb5-4af0-8004-572a25c168d1") : secret "monitoring-plugin-cert" not found Apr 16 14:00:31.429399 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:31.429358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a3cc9243-9cb5-4af0-8004-572a25c168d1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-pn2zh\" (UID: \"a3cc9243-9cb5-4af0-8004-572a25c168d1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" Apr 16 14:00:31.432368 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:31.432341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a3cc9243-9cb5-4af0-8004-572a25c168d1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-pn2zh\" (UID: \"a3cc9243-9cb5-4af0-8004-572a25c168d1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" Apr 16 14:00:31.665323 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:31.665279 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" Apr 16 14:00:31.847373 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:31.847246 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh"] Apr 16 14:00:31.849796 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:00:31.849770 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3cc9243_9cb5_4af0_8004_572a25c168d1.slice/crio-8be8580cfdb33a939ed0242d448b07697f982a717942d984a92e019569ead35c WatchSource:0}: Error finding container 8be8580cfdb33a939ed0242d448b07697f982a717942d984a92e019569ead35c: Status 404 returned error can't find the container with id 8be8580cfdb33a939ed0242d448b07697f982a717942d984a92e019569ead35c Apr 16 14:00:31.939801 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:31.939759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4dhrq" event={"ID":"21d470a0-3408-42d5-a74e-d66c383570a4","Type":"ContainerStarted","Data":"5e9ed8a18329cea524a31548a60928e1e1a475ba9e5a1d4e9ada4495d1a8d5ef"} Apr 16 14:00:31.940824 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:31.940798 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" event={"ID":"a3cc9243-9cb5-4af0-8004-572a25c168d1","Type":"ContainerStarted","Data":"8be8580cfdb33a939ed0242d448b07697f982a717942d984a92e019569ead35c"} Apr 16 14:00:32.234458 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.234421 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:00:32.238318 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.238291 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.244559 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.244524 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:00:32.244838 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.244817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:00:32.244966 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.244921 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:00:32.244966 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.244926 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:00:32.245098 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.245018 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:00:32.245227 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.245207 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:00:32.245476 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.245437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:00:32.245476 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.245455 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:00:32.245619 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.245534 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-vjgbt\"" Apr 16 14:00:32.245922 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.245890 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-71ii1ki6lmv05\"" Apr 16 14:00:32.246036 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.246013 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:00:32.246405 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.246274 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:00:32.246473 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.246413 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:00:32.249262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.249240 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:00:32.250916 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.250892 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:00:32.255547 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.255518 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:00:32.339099 
ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339099 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339099 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339367 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339151 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339367 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339367 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-web-config\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339367 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339367 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339571 ip-10-0-141-131 
kubenswrapper[2575]: I0416 14:00:32.339382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config-out\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339571 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339571 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339571 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339571 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339824 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339824 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqfsn\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-kube-api-access-vqfsn\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339824 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339824 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.339824 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.339757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441042 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-web-config\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config-out\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqfsn\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-kube-api-access-vqfsn\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.441686 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.441604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.443513 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.442510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.443513 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.443194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.444785 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.444462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.445648 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.445620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.445648 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.445634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.446674 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.446624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.446984 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.446690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.447758 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.447715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.447854 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.447821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.447854 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.447833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.450895 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.450872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-web-config\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.452811 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.452034 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config-out\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.452811 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.452346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.453297 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.453254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.453386 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.453318 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.453560 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.453509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.453936 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.453912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.456115 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.456095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqfsn\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-kube-api-access-vqfsn\") pod \"prometheus-k8s-0\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.550383 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.550048 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:32.695616 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.695334 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:00:32.698025 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:00:32.697997 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd52cfce3_aa81_49f2_b4a3_fb9a60960e2a.slice/crio-ffb3770ce7fd1aefd1a4460410c4396fce5821f28e6f1fcaa4d3a005cd58f028 WatchSource:0}: Error finding container ffb3770ce7fd1aefd1a4460410c4396fce5821f28e6f1fcaa4d3a005cd58f028: Status 404 returned error can't find the container with id ffb3770ce7fd1aefd1a4460410c4396fce5821f28e6f1fcaa4d3a005cd58f028 Apr 16 14:00:32.945262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.945170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerStarted","Data":"ffb3770ce7fd1aefd1a4460410c4396fce5821f28e6f1fcaa4d3a005cd58f028"} Apr 16 14:00:32.946702 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.946672 2575 generic.go:358] "Generic (PLEG): container finished" podID="21d470a0-3408-42d5-a74e-d66c383570a4" containerID="5e9ed8a18329cea524a31548a60928e1e1a475ba9e5a1d4e9ada4495d1a8d5ef" exitCode=0 Apr 16 14:00:32.946844 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:32.946716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4dhrq" event={"ID":"21d470a0-3408-42d5-a74e-d66c383570a4","Type":"ContainerDied","Data":"5e9ed8a18329cea524a31548a60928e1e1a475ba9e5a1d4e9ada4495d1a8d5ef"} Apr 16 14:00:33.384875 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.384841 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b9d68db7f-fjmrf"] Apr 16 14:00:33.385096 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:00:33.385076 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
Apr 16 14:00:33.385096 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:00:33.385076 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf" podUID="f6503505-8fc2-4e22-b559-ade573fe4d03"
Apr 16 14:00:33.951240 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.951196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" event={"ID":"a3cc9243-9cb5-4af0-8004-572a25c168d1","Type":"ContainerStarted","Data":"c1383b2a6919c911f042e069a6ab8c61937c064af7741460dbbd972ae2afe0cd"}
Apr 16 14:00:33.951742 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.951427 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh"
Apr 16 14:00:33.953694 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.953657 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4dhrq" event={"ID":"21d470a0-3408-42d5-a74e-d66c383570a4","Type":"ContainerStarted","Data":"76a7be005d0baf8aff61b8bde78d875c1d6ac412961f8b2b78a64d7aed4849c1"}
Apr 16 14:00:33.953863 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.953698 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4dhrq" event={"ID":"21d470a0-3408-42d5-a74e-d66c383570a4","Type":"ContainerStarted","Data":"08a678d6d75f36abb9a76f39b3f7e430bf838fbfbc273e16606e8a51847dd520"}
Apr 16 14:00:33.953863 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.953742 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 14:00:33.957503 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.957477 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh"
Apr 16 14:00:33.959618 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.959601 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf"
Apr 16 14:00:33.968431 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:33.968382 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-pn2zh" podStartSLOduration=2.739953409 podStartE2EDuration="3.968366781s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:00:31.851535534 +0000 UTC m=+206.029784380" lastFinishedPulling="2026-04-16 14:00:33.079948898 +0000 UTC m=+207.258197752" observedRunningTime="2026-04-16 14:00:33.967190952 +0000 UTC m=+208.145439821" watchObservedRunningTime="2026-04-16 14:00:33.968366781 +0000 UTC m=+208.146615650"
Apr 16 14:00:34.002887 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.002810 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4dhrq" podStartSLOduration=3.570596794 podStartE2EDuration="9.002790165s" podCreationTimestamp="2026-04-16 14:00:25 +0000 UTC" firstStartedPulling="2026-04-16 14:00:26.337193725 +0000 UTC m=+200.515442571" lastFinishedPulling="2026-04-16 14:00:31.769387079 +0000 UTC m=+205.947635942" observedRunningTime="2026-04-16 14:00:34.002181489 +0000 UTC m=+208.180430358" watchObservedRunningTime="2026-04-16 14:00:34.002790165 +0000 UTC m=+208.181039034"
Apr 16 14:00:34.055649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.055614 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-certificates\") pod \"f6503505-8fc2-4e22-b559-ade573fe4d03\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") "
Apr 16 14:00:34.055842 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.055675 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-bound-sa-token\") pod \"f6503505-8fc2-4e22-b559-ade573fe4d03\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") "
Apr 16 14:00:34.055842 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.055704 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stsnk\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-kube-api-access-stsnk\") pod \"f6503505-8fc2-4e22-b559-ade573fe4d03\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") "
Apr 16 14:00:34.055842 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.055766 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-installation-pull-secrets\") pod \"f6503505-8fc2-4e22-b559-ade573fe4d03\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") "
Apr 16 14:00:34.055842 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.055802 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6503505-8fc2-4e22-b559-ade573fe4d03-ca-trust-extracted\") pod \"f6503505-8fc2-4e22-b559-ade573fe4d03\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") "
Apr 16 14:00:34.056042 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.055844 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-image-registry-private-configuration\") pod \"f6503505-8fc2-4e22-b559-ade573fe4d03\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") "
Apr 16 14:00:34.056042 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.055897 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-trusted-ca\") pod \"f6503505-8fc2-4e22-b559-ade573fe4d03\" (UID: \"f6503505-8fc2-4e22-b559-ade573fe4d03\") "
Apr 16 14:00:34.056471 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.056156 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6503505-8fc2-4e22-b559-ade573fe4d03-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f6503505-8fc2-4e22-b559-ade573fe4d03" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:00:34.056471 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.056060 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f6503505-8fc2-4e22-b559-ade573fe4d03" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:00:34.056471 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.056385 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f6503505-8fc2-4e22-b559-ade573fe4d03" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:00:34.056860 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.056835 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-certificates\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\""
Apr 16 14:00:34.056974 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.056865 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6503505-8fc2-4e22-b559-ade573fe4d03-ca-trust-extracted\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\""
Apr 16 14:00:34.056974 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.056883 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6503505-8fc2-4e22-b559-ade573fe4d03-trusted-ca\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\""
Apr 16 14:00:34.058547 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.058518 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f6503505-8fc2-4e22-b559-ade573fe4d03" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:00:34.058649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.058542 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f6503505-8fc2-4e22-b559-ade573fe4d03" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:00:34.058880 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.058860 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f6503505-8fc2-4e22-b559-ade573fe4d03" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:00:34.059079 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.059057 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-kube-api-access-stsnk" (OuterVolumeSpecName: "kube-api-access-stsnk") pod "f6503505-8fc2-4e22-b559-ade573fe4d03" (UID: "f6503505-8fc2-4e22-b559-ade573fe4d03"). InnerVolumeSpecName "kube-api-access-stsnk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:00:34.158081 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.158044 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-installation-pull-secrets\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\""
Apr 16 14:00:34.158081 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.158077 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f6503505-8fc2-4e22-b559-ade573fe4d03-image-registry-private-configuration\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\""
Apr 16 14:00:34.158081 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.158088 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-bound-sa-token\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\""
Apr 16 14:00:34.158327 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.158098 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-stsnk\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-kube-api-access-stsnk\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\""
Apr 16 14:00:34.961123 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.961087 2575 generic.go:358] "Generic (PLEG): container finished" podID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerID="9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405" exitCode=0
Apr 16 14:00:34.961503 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.961166 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerDied","Data":"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405"}
Apr 16 14:00:34.961503 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:34.961236 2575 util.go:30] "No sandbox
for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b9d68db7f-fjmrf" Apr 16 14:00:35.020038 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:35.020008 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b9d68db7f-fjmrf"] Apr 16 14:00:35.023856 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:35.023828 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6b9d68db7f-fjmrf"] Apr 16 14:00:35.067828 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:35.067793 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6503505-8fc2-4e22-b559-ade573fe4d03-registry-tls\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:00:36.391975 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:36.391936 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6503505-8fc2-4e22-b559-ade573fe4d03" path="/var/lib/kubelet/pods/f6503505-8fc2-4e22-b559-ade573fe4d03/volumes" Apr 16 14:00:37.971596 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:37.971560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerStarted","Data":"8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c"} Apr 16 14:00:37.971596 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:37.971602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerStarted","Data":"77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8"} Apr 16 14:00:38.387127 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:38.387097 2575 scope.go:117] "RemoveContainer" containerID="53104a9ef0427c53337652dbcddd1d653880e70e91a6e09f88461acfaa6071c4" Apr 16 14:00:38.976519 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:38.976486 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:00:38.976983 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:38.976610 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" event={"ID":"24046d5b-b6df-4005-85d5-01cafc82cc40","Type":"ContainerStarted","Data":"0d4888a89352bf205da99077c2f46c95c04fb128f8232f3c645ed9fe3e4e44bc"} Apr 16 14:00:38.976983 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:38.976914 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 14:00:38.994878 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:38.994825 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" podStartSLOduration=53.921965738 podStartE2EDuration="56.994807864s" podCreationTimestamp="2026-04-16 13:59:42 +0000 UTC" firstStartedPulling="2026-04-16 13:59:43.566627014 +0000 UTC m=+157.744875861" lastFinishedPulling="2026-04-16 13:59:46.639469141 +0000 UTC m=+160.817717987" observedRunningTime="2026-04-16 14:00:38.993087776 +0000 UTC m=+213.171336643" watchObservedRunningTime="2026-04-16 14:00:38.994807864 +0000 UTC m=+213.173056731" Apr 16 14:00:39.795279 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:39.795243 2575 
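[Reading aid, not journal output: the "Observed pod startup duration" entries in this log are internally consistent if podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp and podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling), with the pull window taken from the monotonic m=+ offsets. Worked check against the monitoring-plugin-5876b4bbc7-pn2zh entry above:
    E2E  = 14:00:33.968366781 - 14:00:30 = 3.968366781 s
    pull = 207.258197752 - 206.029784380 = 1.228413372 s   (m=+ offsets)
    SLO  = 3.968366781 - 1.228413372 = 2.739953409 s       (matches podStartSLOduration=2.739953409)
The same arithmetic reproduces the node-exporter-4dhrq and console-operator figures.]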
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zxb7l" Apr 16 14:00:39.988374 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:39.988336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerStarted","Data":"183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5"} Apr 16 14:00:39.988826 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:39.988380 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerStarted","Data":"32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde"} Apr 16 14:00:39.988826 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:39.988396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerStarted","Data":"432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215"} Apr 16 14:00:39.988826 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:39.988408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerStarted","Data":"7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573"} Apr 16 14:00:40.017389 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:40.017328 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.058008772 podStartE2EDuration="8.017309668s" podCreationTimestamp="2026-04-16 14:00:32 +0000 UTC" firstStartedPulling="2026-04-16 14:00:32.699949886 +0000 UTC m=+206.878198732" lastFinishedPulling="2026-04-16 14:00:39.659250778 +0000 UTC m=+213.837499628" observedRunningTime="2026-04-16 14:00:40.016315155 +0000 UTC m=+214.194564046" watchObservedRunningTime="2026-04-16 14:00:40.017309668 +0000 UTC m=+214.195558537" Apr 16 14:00:42.551307 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:42.551268 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:00:58.037481 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:58.037447 2575 generic.go:358] "Generic (PLEG): container finished" podID="18e2e1da-09fc-4969-99a4-1d53b1a12d83" containerID="9f43884104bf23b1d2ebaf5bf1f2429abd76b2baa1d7149a4e12d51c967ad5b7" exitCode=0 Apr 16 14:00:58.037931 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:58.037518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" event={"ID":"18e2e1da-09fc-4969-99a4-1d53b1a12d83","Type":"ContainerDied","Data":"9f43884104bf23b1d2ebaf5bf1f2429abd76b2baa1d7149a4e12d51c967ad5b7"} Apr 16 14:00:58.037931 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:58.037921 2575 scope.go:117] "RemoveContainer" containerID="9f43884104bf23b1d2ebaf5bf1f2429abd76b2baa1d7149a4e12d51c967ad5b7" Apr 16 14:00:59.041612 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:00:59.041576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-5zl6t" event={"ID":"18e2e1da-09fc-4969-99a4-1d53b1a12d83","Type":"ContainerStarted","Data":"be4c00230be5c6c5ca5310ea6b1fc3693240af22de6a5e9cc71112f55c374474"} Apr 16 14:01:01.535553 ip-10-0-141-131 kubenswrapper[2575]: I0416 
14:01:01.535523 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-pn2zh_a3cc9243-9cb5-4af0-8004-572a25c168d1/monitoring-plugin/0.log" Apr 16 14:01:01.736087 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:01.736061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4dhrq_21d470a0-3408-42d5-a74e-d66c383570a4/init-textfile/0.log" Apr 16 14:01:01.936712 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:01.936687 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4dhrq_21d470a0-3408-42d5-a74e-d66c383570a4/node-exporter/0.log" Apr 16 14:01:02.136564 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:02.136538 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4dhrq_21d470a0-3408-42d5-a74e-d66c383570a4/kube-rbac-proxy/0.log" Apr 16 14:01:04.138670 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:04.138629 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d52cfce3-aa81-49f2-b4a3-fb9a60960e2a/init-config-reloader/0.log" Apr 16 14:01:04.341017 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:04.340976 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d52cfce3-aa81-49f2-b4a3-fb9a60960e2a/prometheus/0.log" Apr 16 14:01:04.536533 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:04.536443 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d52cfce3-aa81-49f2-b4a3-fb9a60960e2a/config-reloader/0.log" Apr 16 14:01:04.736625 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:04.736595 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d52cfce3-aa81-49f2-b4a3-fb9a60960e2a/thanos-sidecar/0.log" Apr 16 14:01:04.936359 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:04.936331 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d52cfce3-aa81-49f2-b4a3-fb9a60960e2a/kube-rbac-proxy-web/0.log" Apr 16 14:01:05.136131 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:05.136104 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d52cfce3-aa81-49f2-b4a3-fb9a60960e2a/kube-rbac-proxy/0.log" Apr 16 14:01:05.338715 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:05.338645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d52cfce3-aa81-49f2-b4a3-fb9a60960e2a/kube-rbac-proxy-thanos/0.log" Apr 16 14:01:05.537804 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:05.537774 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-xhcvf_ea940c5e-d3a3-4fb9-8758-aaada6c8070d/prometheus-operator/0.log" Apr 16 14:01:05.735848 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:05.735819 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-xhcvf_ea940c5e-d3a3-4fb9-8758-aaada6c8070d/kube-rbac-proxy/0.log" Apr 16 14:01:07.337182 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:07.337154 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-6mm5s_e84621c2-6f3b-487c-8a13-426a6d91539c/networking-console-plugin/0.log" Apr 16 14:01:07.536996 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:07.536971 
2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:01:07.737272 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:07.737241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/3.log" Apr 16 14:01:08.073897 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:08.073793 2575 generic.go:358] "Generic (PLEG): container finished" podID="e8c123a6-dc70-4989-aff2-c7374863a689" containerID="3a8385718364b68591b2ddebde45ad0fe00c9e496a774f7dc81b33807c796bad" exitCode=0 Apr 16 14:01:08.073897 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:08.073844 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" event={"ID":"e8c123a6-dc70-4989-aff2-c7374863a689","Type":"ContainerDied","Data":"3a8385718364b68591b2ddebde45ad0fe00c9e496a774f7dc81b33807c796bad"} Apr 16 14:01:08.074173 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:08.074158 2575 scope.go:117] "RemoveContainer" containerID="3a8385718364b68591b2ddebde45ad0fe00c9e496a774f7dc81b33807c796bad" Apr 16 14:01:09.078810 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:09.078776 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qsrlw" event={"ID":"e8c123a6-dc70-4989-aff2-c7374863a689","Type":"ContainerStarted","Data":"4c1276652dbbd04694d3e8ecc6730eea4d25e146bcc3e439563ad1adb69053cb"} Apr 16 14:01:09.935335 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:09.935308 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hvqjp_1b680da6-ab85-4c31-98d8-35be4b07624b/dns-node-resolver/0.log" Apr 16 14:01:17.147287 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:17.147247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 14:01:17.149671 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:17.149642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77d59171-3e29-4e55-a4d9-a076a67a50ce-metrics-certs\") pod \"network-metrics-daemon-crlsp\" (UID: \"77d59171-3e29-4e55-a4d9-a076a67a50ce\") " pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 14:01:17.390123 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:17.390091 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jz268\"" Apr 16 14:01:17.397935 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:17.397878 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-crlsp" Apr 16 14:01:17.517904 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:17.517870 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-crlsp"] Apr 16 14:01:17.521520 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:01:17.521483 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77d59171_3e29_4e55_a4d9_a076a67a50ce.slice/crio-3576dd21fbed44254997542daa3f6d7f6312853422561894a7af9abab256c2a9 WatchSource:0}: Error finding container 3576dd21fbed44254997542daa3f6d7f6312853422561894a7af9abab256c2a9: Status 404 returned error can't find the container with id 3576dd21fbed44254997542daa3f6d7f6312853422561894a7af9abab256c2a9 Apr 16 14:01:18.105219 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:18.105188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-crlsp" event={"ID":"77d59171-3e29-4e55-a4d9-a076a67a50ce","Type":"ContainerStarted","Data":"3576dd21fbed44254997542daa3f6d7f6312853422561894a7af9abab256c2a9"} Apr 16 14:01:19.109576 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:19.109539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-crlsp" event={"ID":"77d59171-3e29-4e55-a4d9-a076a67a50ce","Type":"ContainerStarted","Data":"234b49b6774f7fbdac8062b34cff59dfe0769c21bf3fa3726680c4f784dae7b6"} Apr 16 14:01:19.109943 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:19.109584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-crlsp" event={"ID":"77d59171-3e29-4e55-a4d9-a076a67a50ce","Type":"ContainerStarted","Data":"a5db2e49d20061ac7cae2bcfca08757bf58bc5cd8b643ed469ffa53ad58a349a"} Apr 16 14:01:19.127530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:19.127478 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-crlsp" podStartSLOduration=251.800990992 podStartE2EDuration="4m13.127463498s" podCreationTimestamp="2026-04-16 13:57:06 +0000 UTC" firstStartedPulling="2026-04-16 14:01:17.523453875 +0000 UTC m=+251.701702721" lastFinishedPulling="2026-04-16 14:01:18.849926363 +0000 UTC m=+253.028175227" observedRunningTime="2026-04-16 14:01:19.125467542 +0000 UTC m=+253.303716407" watchObservedRunningTime="2026-04-16 14:01:19.127463498 +0000 UTC m=+253.305712366" Apr 16 14:01:32.550696 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:32.550630 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:32.570629 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:32.570604 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:33.166239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:33.166204 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:45.771047 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:01:45.771002 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hgrw6" podUID="b77dfd1a-6bd6-449b-8db3-c93ff41eb18e" Apr 16 14:01:45.771047 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:01:45.771007 2575 
Apr 16 14:01:46.186597 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:46.186565 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 14:01:46.186801 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:46.186751 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g5lb6"
Apr 16 14:01:49.312667 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.312627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 14:01:49.313070 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.312689 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 14:01:49.315070 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.315049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92799234-6fff-45be-a27c-c70096483d30-metrics-tls\") pod \"dns-default-g5lb6\" (UID: \"92799234-6fff-45be-a27c-c70096483d30\") " pod="openshift-dns/dns-default-g5lb6"
Apr 16 14:01:49.315227 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.315204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b77dfd1a-6bd6-449b-8db3-c93ff41eb18e-cert\") pod \"ingress-canary-hgrw6\" (UID: \"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e\") " pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 14:01:49.489981 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.489950 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ftzrr\""
Apr 16 14:01:49.490764 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.490746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mv45m\""
Apr 16 14:01:49.497688 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.497663 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g5lb6"
Apr 16 14:01:49.497839 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.497803 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hgrw6"
Apr 16 14:01:49.622859 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.622825 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hgrw6"]
Apr 16 14:01:49.626738 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:01:49.626693 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77dfd1a_6bd6_449b_8db3_c93ff41eb18e.slice/crio-cbc3bb0399635f224058e20cde5ec04ab4bef79dcbd8e14a8b4e00acddf08fac WatchSource:0}: Error finding container cbc3bb0399635f224058e20cde5ec04ab4bef79dcbd8e14a8b4e00acddf08fac: Status 404 returned error can't find the container with id cbc3bb0399635f224058e20cde5ec04ab4bef79dcbd8e14a8b4e00acddf08fac
Apr 16 14:01:49.646543 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:49.646513 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g5lb6"]
Apr 16 14:01:49.649593 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:01:49.649557 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92799234_6fff_45be_a27c_c70096483d30.slice/crio-e7ad162079f02f073b084718699cda72531772b85af7645fa8491f04fc4f8840 WatchSource:0}: Error finding container e7ad162079f02f073b084718699cda72531772b85af7645fa8491f04fc4f8840: Status 404 returned error can't find the container with id e7ad162079f02f073b084718699cda72531772b85af7645fa8491f04fc4f8840
Apr 16 14:01:50.200295 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:50.200259 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hgrw6" event={"ID":"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e","Type":"ContainerStarted","Data":"cbc3bb0399635f224058e20cde5ec04ab4bef79dcbd8e14a8b4e00acddf08fac"}
Apr 16 14:01:50.201407 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:50.201372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g5lb6" event={"ID":"92799234-6fff-45be-a27c-c70096483d30","Type":"ContainerStarted","Data":"e7ad162079f02f073b084718699cda72531772b85af7645fa8491f04fc4f8840"}
Apr 16 14:01:51.206814 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:51.206754 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g5lb6" event={"ID":"92799234-6fff-45be-a27c-c70096483d30","Type":"ContainerStarted","Data":"7c20fd85527e9ce0affe64423eea235c4de9308a5544533b67f8b79f1737aa3d"}
Apr 16 14:01:52.211087 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:52.211047 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hgrw6" event={"ID":"b77dfd1a-6bd6-449b-8db3-c93ff41eb18e","Type":"ContainerStarted","Data":"5ac0f81fdc9038906ffe820eac5349a07ceda002315acb0fe95a9ab0075a4ab9"}
Apr 16 14:01:52.212706 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:52.212681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g5lb6" event={"ID":"92799234-6fff-45be-a27c-c70096483d30","Type":"ContainerStarted","Data":"455ab177ec1c8491af17b1c39835300e82fd7f839819655a77feb4c6ccc6307e"}
Apr 16 14:01:52.212838 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:52.212771 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-g5lb6"
Apr 16 14:01:52.229145 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:52.229091 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hgrw6" podStartSLOduration=251.121611574 podStartE2EDuration="4m13.229076243s" podCreationTimestamp="2026-04-16 13:57:39 +0000 UTC" firstStartedPulling="2026-04-16 14:01:49.629212541 +0000 UTC m=+283.807461386" lastFinishedPulling="2026-04-16 14:01:51.736677209 +0000 UTC m=+285.914926055" observedRunningTime="2026-04-16 14:01:52.22729409 +0000 UTC m=+286.405542958" watchObservedRunningTime="2026-04-16 14:01:52.229076243 +0000 UTC m=+286.407325111"
Apr 16 14:01:52.244140 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:52.243816 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g5lb6" podStartSLOduration=251.908658752 podStartE2EDuration="4m13.243797363s" podCreationTimestamp="2026-04-16 13:57:39 +0000 UTC" firstStartedPulling="2026-04-16 14:01:49.651385844 +0000 UTC m=+283.829634689" lastFinishedPulling="2026-04-16 14:01:50.986524434 +0000 UTC m=+285.164773300" observedRunningTime="2026-04-16 14:01:52.243588838 +0000 UTC m=+286.421837740" watchObservedRunningTime="2026-04-16 14:01:52.243797363 +0000 UTC m=+286.422046267"
Apr 16 14:01:54.421294 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.421254 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:54.421965 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.421928 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="prometheus" containerID="cri-o://77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8" gracePeriod=600
Apr 16 14:01:54.422177 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.422153 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy-thanos" containerID="cri-o://183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5" gracePeriod=600
Apr 16 14:01:54.422277 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.422083 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy" containerID="cri-o://32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde" gracePeriod=600
Apr 16 14:01:54.422370 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.422351 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="thanos-sidecar" containerID="cri-o://7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573" gracePeriod=600
Apr 16 14:01:54.422453 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.422420 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy-web" containerID="cri-o://432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215" gracePeriod=600
Apr 16 14:01:54.423141 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.422445 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="config-reloader" containerID="cri-o://8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c" gracePeriod=600
Apr 16 14:01:54.661238 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.661214 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:54.759394 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759318 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config-out\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
Apr 16 14:01:54.759394 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759386 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-kubelet-serving-ca-bundle\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
Apr 16 14:01:54.759607 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759413 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqfsn\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-kube-api-access-vqfsn\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
Apr 16 14:01:54.759607 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759444 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-tls\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
Apr 16 14:01:54.759607 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759472 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-serving-certs-ca-bundle\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
Apr 16 14:01:54.759607 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759494 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
Apr 16 14:01:54.759607 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759519 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-metrics-client-ca\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
Apr 16 14:01:54.759607 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759550 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-web-config\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
Apr 16 14:01:54.759607 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759573 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-tls-assets\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") "
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-tls-assets\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759615 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759655 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-rulefiles-0\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759684 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-metrics-client-certs\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759751 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-kube-rbac-proxy\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759787 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759811 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-grpc-tls\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759820 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759835 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-trusted-ca-bundle\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759898 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-thanos-prometheus-http-client-file\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759915 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:54.759995 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.759931 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-db\") pod \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\" (UID: \"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a\") " Apr 16 14:01:54.760530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.760152 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:54.760530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.760222 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.760530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.760242 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.760530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.760258 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.760912 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.760880 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:54.762212 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.762183 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-kube-api-access-vqfsn" (OuterVolumeSpecName: "kube-api-access-vqfsn") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "kube-api-access-vqfsn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:54.762393 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.762357 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.762553 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.762529 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config-out" (OuterVolumeSpecName: "config-out") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:01:54.762771 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.762717 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.762771 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.762762 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:54.763530 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.763497 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:01:54.763629 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.763507 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:54.763695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.763641 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.764163 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.764132 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config" (OuterVolumeSpecName: "config") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.764578 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.764561 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.764955 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.764936 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.765044 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.765031 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.765778 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.765757 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.773949 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.773923 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-web-config" (OuterVolumeSpecName: "web-config") pod "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" (UID: "d52cfce3-aa81-49f2-b4a3-fb9a60960e2a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:54.861402 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861363 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861402 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861394 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-db\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861402 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861405 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config-out\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861402 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861414 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vqfsn\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-kube-api-access-vqfsn\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861424 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861434 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-config\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861443 2575 reconciler_common.go:299] "Volume detached for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-configmap-metrics-client-ca\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861452 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-web-config\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861460 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-tls-assets\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861468 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861479 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861488 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-metrics-client-certs\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861496 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-kube-rbac-proxy\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861507 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:54.861649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:54.861516 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a-secret-grpc-tls\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:01:55.224920 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.224887 2575 generic.go:358] "Generic (PLEG): container finished" podID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerID="183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5" exitCode=0 Apr 16 14:01:55.224920 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.224914 2575 generic.go:358] "Generic (PLEG): container finished" podID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerID="32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde" exitCode=0 Apr 16 14:01:55.224920 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.224925 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerID="432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215" exitCode=0 Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.224934 2575 generic.go:358] "Generic (PLEG): container finished" podID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerID="7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573" exitCode=0 Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.224942 2575 generic.go:358] "Generic (PLEG): container finished" podID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerID="8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c" exitCode=0 Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.224947 2575 generic.go:358] "Generic (PLEG): container finished" podID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerID="77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8" exitCode=0 Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.224972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerDied","Data":"183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5"} Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.225002 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.225013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerDied","Data":"32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde"} Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.225025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerDied","Data":"432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215"} Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.225034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerDied","Data":"7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573"} Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.225043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerDied","Data":"8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c"} Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.225052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerDied","Data":"77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8"} Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.225060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d52cfce3-aa81-49f2-b4a3-fb9a60960e2a","Type":"ContainerDied","Data":"ffb3770ce7fd1aefd1a4460410c4396fce5821f28e6f1fcaa4d3a005cd58f028"} Apr 16 14:01:55.225147 ip-10-0-141-131 kubenswrapper[2575]: I0416 
14:01:55.225082 2575 scope.go:117] "RemoveContainer" containerID="183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5" Apr 16 14:01:55.236914 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.236885 2575 scope.go:117] "RemoveContainer" containerID="32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde" Apr 16 14:01:55.245974 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.245952 2575 scope.go:117] "RemoveContainer" containerID="432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215" Apr 16 14:01:55.252533 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.252516 2575 scope.go:117] "RemoveContainer" containerID="7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573" Apr 16 14:01:55.256262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.256240 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:01:55.260005 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.259982 2575 scope.go:117] "RemoveContainer" containerID="8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c" Apr 16 14:01:55.266364 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.266348 2575 scope.go:117] "RemoveContainer" containerID="77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8" Apr 16 14:01:55.273389 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.273368 2575 scope.go:117] "RemoveContainer" containerID="9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405" Apr 16 14:01:55.275667 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.275646 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:01:55.280179 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.280159 2575 scope.go:117] "RemoveContainer" containerID="183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5" Apr 16 14:01:55.280426 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:01:55.280405 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5\": container with ID starting with 183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5 not found: ID does not exist" containerID="183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5" Apr 16 14:01:55.280478 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.280434 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5"} err="failed to get container status \"183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5\": rpc error: code = NotFound desc = could not find container \"183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5\": container with ID starting with 183b07bc41e2b43bb6625e2e745d3ec46bdddd6e36fa0d8141251b5ee9bbb4d5 not found: ID does not exist" Apr 16 14:01:55.280478 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.280468 2575 scope.go:117] "RemoveContainer" containerID="32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde" Apr 16 14:01:55.280695 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:01:55.280680 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde\": container with ID starting with 32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde not 
found: ID does not exist" containerID="32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde" Apr 16 14:01:55.280754 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.280699 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde"} err="failed to get container status \"32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde\": rpc error: code = NotFound desc = could not find container \"32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde\": container with ID starting with 32ee73bcc49289cf3e76e17a0cdf06ee9ea0da299bec5ae06dfa5d02f657bfde not found: ID does not exist" Apr 16 14:01:55.280754 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.280713 2575 scope.go:117] "RemoveContainer" containerID="432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215" Apr 16 14:01:55.280932 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:01:55.280917 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215\": container with ID starting with 432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215 not found: ID does not exist" containerID="432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215" Apr 16 14:01:55.280976 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.280933 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215"} err="failed to get container status \"432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215\": rpc error: code = NotFound desc = could not find container \"432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215\": container with ID starting with 432d75c51333edb00daed5d7a152a6366255027d92826fbbb8e750592d89d215 not found: ID does not exist" Apr 16 14:01:55.280976 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.280944 2575 scope.go:117] "RemoveContainer" containerID="7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573" Apr 16 14:01:55.281139 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:01:55.281122 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573\": container with ID starting with 7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573 not found: ID does not exist" containerID="7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573" Apr 16 14:01:55.281176 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.281141 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573"} err="failed to get container status \"7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573\": rpc error: code = NotFound desc = could not find container \"7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573\": container with ID starting with 7199c40c24207a30a783d90a7b745bebf5b082b44ed6c982bb352901ec770573 not found: ID does not exist" Apr 16 14:01:55.281176 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.281152 2575 scope.go:117] "RemoveContainer" containerID="8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c" Apr 16 14:01:55.281351 ip-10-0-141-131 
kubenswrapper[2575]: E0416 14:01:55.281335 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c\": container with ID starting with 8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c not found: ID does not exist" containerID="8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c" Apr 16 14:01:55.281398 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.281353 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c"} err="failed to get container status \"8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c\": rpc error: code = NotFound desc = could not find container \"8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c\": container with ID starting with 8cb891b3beb30251c3ec748254abb1a31d18439895e4126684d0cca10947681c not found: ID does not exist" Apr 16 14:01:55.281398 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.281366 2575 scope.go:117] "RemoveContainer" containerID="77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8" Apr 16 14:01:55.281573 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:01:55.281558 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8\": container with ID starting with 77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8 not found: ID does not exist" containerID="77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8" Apr 16 14:01:55.281620 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.281576 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8"} err="failed to get container status \"77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8\": rpc error: code = NotFound desc = could not find container \"77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8\": container with ID starting with 77d234bc23983610f5f446b8f69dab2ce921fe2c329eb8c123f47e036e60cac8 not found: ID does not exist" Apr 16 14:01:55.281620 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.281589 2575 scope.go:117] "RemoveContainer" containerID="9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405" Apr 16 14:01:55.281779 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:01:55.281763 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405\": container with ID starting with 9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405 not found: ID does not exist" containerID="9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405" Apr 16 14:01:55.281828 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.281783 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405"} err="failed to get container status \"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405\": rpc error: code = NotFound desc = could not find container \"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405\": container with ID starting with 
9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405 not found: ID does not exist"
-- the same "RemoveContainer" / "DeleteContainer returned error" pair repeats for each of the seven container IDs above several more times between 14:01:55.281 and 14:01:55.288; only the final pair is kept below --
Apr 16 14:01:55.288379 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.288379 2575 scope.go:117] "RemoveContainer" containerID="9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405" Apr 16 14:01:55.288556 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.288540 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405"} err="failed to get container status \"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405\": rpc error: code = NotFound desc = could not find container \"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405\": container with ID starting with 9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405 not found: ID does not exist"
\"9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405\": container with ID starting with 9f04343184b0e6bf6f2b140e8235d8bc08c835a9c31a52e77cac0be1ba8d5405 not found: ID does not exist" Apr 16 14:01:55.326232 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326201 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:01:55.326503 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326489 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="init-config-reloader" Apr 16 14:01:55.326548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326518 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="init-config-reloader" Apr 16 14:01:55.326548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326526 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="config-reloader" Apr 16 14:01:55.326548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326531 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="config-reloader" Apr 16 14:01:55.326548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326542 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="prometheus" Apr 16 14:01:55.326548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326547 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="prometheus" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326555 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy-thanos" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326560 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy-thanos" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326571 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy-web" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326576 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy-web" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326583 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326588 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326600 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="thanos-sidecar" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326606 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="thanos-sidecar" Apr 16 
Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326652 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326660 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy-thanos" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326668 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="kube-rbac-proxy-web" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326674 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="thanos-sidecar" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326681 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="prometheus" Apr 16 14:01:55.326747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.326686 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" containerName="config-reloader" Apr 16 14:01:55.331886 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.331867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.335586 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.335547 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:01:55.335586 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.335579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:01:55.335773 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.335755 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:01:55.335897 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.335825 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-vjgbt\"" Apr 16 14:01:55.335897 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.335825 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:01:55.336062 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.336043 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:01:55.336193 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.336108 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:01:55.336456 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.336435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:01:55.336859 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.336811 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:01:55.336859 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.336818 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:01:55.337456 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.337441 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-71ii1ki6lmv05\"" Apr 16 14:01:55.338157 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.338125 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:01:55.340136 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.340115 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:01:55.340357 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.340340 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:01:55.343228 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.343211 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
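The "Caches populated" entries above are structured klog output: a quoted message followed by key="value" fields, with inner quotes escaped as \". A short hedged example of extracting those fields from one such line; the regular expression and program are illustrative tooling, not part of the kubelet:

package main

import (
	"fmt"
	"regexp"
)

// kv matches key="value" pairs, allowing escaped characters inside the value.
var kv = regexp.MustCompile(`(\w+)="((?:[^"\\]|\\.)*)"`)

func main() {
	line := `I0416 14:01:55.343211 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""`
	for _, m := range kv.FindAllStringSubmatch(line, -1) {
		fmt.Printf("%s = %s\n", m[1], m[2]) // prints the type and reflector fields
	}
}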
\"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-config\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-web-config\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.466695 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhfv\" (UniqueName: \"kubernetes.io/projected/e40b9af4-afc4-44d6-8e70-b76029a57119-kube-api-access-fjhfv\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.467174 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.467174 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.467174 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e40b9af4-afc4-44d6-8e70-b76029a57119-config-out\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.467174 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.466849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.567791 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.567791 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.567791 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e40b9af4-afc4-44d6-8e70-b76029a57119-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-config\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-web-config\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.567993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568526 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568526 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjhfv\" (UniqueName: \"kubernetes.io/projected/e40b9af4-afc4-44d6-8e70-b76029a57119-kube-api-access-fjhfv\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568526 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568526 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568526 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e40b9af4-afc4-44d6-8e70-b76029a57119-config-out\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568796 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.568796 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.568683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.569269 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.569239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.570830 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.570804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e40b9af4-afc4-44d6-8e70-b76029a57119-config-out\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.570830 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.570820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e40b9af4-afc4-44d6-8e70-b76029a57119-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.570990 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.570881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.570990 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.570979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.571124 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.571102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.571191 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.571175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.571927 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.571852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.571927 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.571881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-config\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.572172 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.572150 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.572794 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.572774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.573592 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.573568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.573647 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.573632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.573960 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.573946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e40b9af4-afc4-44d6-8e70-b76029a57119-web-config\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.574391 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.574374 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e40b9af4-afc4-44d6-8e70-b76029a57119-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.577809 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.577789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhfv\" (UniqueName: \"kubernetes.io/projected/e40b9af4-afc4-44d6-8e70-b76029a57119-kube-api-access-fjhfv\") pod \"prometheus-k8s-0\" (UID: \"e40b9af4-afc4-44d6-8e70-b76029a57119\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.641828 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.641784 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:55.771849 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:55.771803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:01:55.775065 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:01:55.775036 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40b9af4_afc4_44d6_8e70_b76029a57119.slice/crio-86b7b51e2e0eb48eff5252bb2d3ae7534e552373a935e923d264f00adfaaf3c6 WatchSource:0}: Error finding container 86b7b51e2e0eb48eff5252bb2d3ae7534e552373a935e923d264f00adfaaf3c6: Status 404 returned error can't find the container with id 86b7b51e2e0eb48eff5252bb2d3ae7534e552373a935e923d264f00adfaaf3c6 Apr 16 14:01:56.229330 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:56.229290 2575 generic.go:358] "Generic (PLEG): container finished" podID="e40b9af4-afc4-44d6-8e70-b76029a57119" containerID="cb08c97b6552c593467aec6324fa193c7a23a343f1a368e77eb5b827928015e9" exitCode=0 Apr 16 14:01:56.229502 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:56.229380 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e40b9af4-afc4-44d6-8e70-b76029a57119","Type":"ContainerDied","Data":"cb08c97b6552c593467aec6324fa193c7a23a343f1a368e77eb5b827928015e9"} Apr 16 14:01:56.229502 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:56.229419 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e40b9af4-afc4-44d6-8e70-b76029a57119","Type":"ContainerStarted","Data":"86b7b51e2e0eb48eff5252bb2d3ae7534e552373a935e923d264f00adfaaf3c6"} Apr 16 14:01:56.392836 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:56.392800 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d52cfce3-aa81-49f2-b4a3-fb9a60960e2a" path="/var/lib/kubelet/pods/d52cfce3-aa81-49f2-b4a3-fb9a60960e2a/volumes" Apr 16 14:01:57.237622 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:57.237588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e40b9af4-afc4-44d6-8e70-b76029a57119","Type":"ContainerStarted","Data":"a86df25ee548303291aa86128d892b47ae867068a2a057b5c8f1a9394d50906a"} Apr 16 14:01:57.238093 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:57.237628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e40b9af4-afc4-44d6-8e70-b76029a57119","Type":"ContainerStarted","Data":"bd4359e6393dff28a837e7499de3308d7bdfb8daa989b7fe52548455980c9ebc"} Apr 16 14:01:57.238093 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:57.237645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e40b9af4-afc4-44d6-8e70-b76029a57119","Type":"ContainerStarted","Data":"a2af9e29b9ee7f2b49760a1ecff8a314d68bc76af433e27c589c04c48e56b59d"} Apr 16 14:01:57.238093 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:57.237657 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e40b9af4-afc4-44d6-8e70-b76029a57119","Type":"ContainerStarted","Data":"4313349daeaed520377dec449e58045aa04e4fca09bf57d4e0df6352cf4963ef"} Apr 16 14:01:57.238093 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:57.237668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e40b9af4-afc4-44d6-8e70-b76029a57119","Type":"ContainerStarted","Data":"8c5d741f4a84d508d6cfd6a6ae67be19543c761b3531dfe2fde0108b5fa60d8a"} Apr 16 14:01:57.238093 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:57.237678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e40b9af4-afc4-44d6-8e70-b76029a57119","Type":"ContainerStarted","Data":"2e071f7915617918e950ca0577c53d542b4cc02518369d40ec960aad3aa3c5a2"} Apr 16 14:01:57.268294 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:01:57.268243 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.268226513 podStartE2EDuration="2.268226513s" podCreationTimestamp="2026-04-16 14:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:57.266786774 +0000 UTC m=+291.445035642" watchObservedRunningTime="2026-04-16 14:01:57.268226513 +0000 UTC m=+291.446475389" Apr 16 14:02:00.642713 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:02:00.642663 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:02.219095 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:02:02.219061 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g5lb6" Apr 16 14:02:06.281197 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:02:06.281171 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:02:06.281623 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:02:06.281602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:02:55.642858 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:02:55.642760 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:55.658254 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:02:55.658224 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:56.419918 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:02:56.419891 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:07:06.301819 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:06.301786 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:07:06.303577 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:06.303551 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:07:55.552691 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.552656 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-8xspw"] Apr 16 14:07:55.554874 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.554852 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-8xspw" Apr 16 14:07:55.557169 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.557139 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:07:55.557275 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.557139 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:07:55.557896 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.557877 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-m5pt5\"" Apr 16 14:07:55.557953 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.557877 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 14:07:55.563193 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.563165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-8xspw"] Apr 16 14:07:55.646468 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.646437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dw5s\" (UniqueName: \"kubernetes.io/projected/75935f8c-8b9f-42fa-b199-a53244c5faff-kube-api-access-6dw5s\") pod \"s3-init-8xspw\" (UID: \"75935f8c-8b9f-42fa-b199-a53244c5faff\") " pod="kserve/s3-init-8xspw" Apr 16 14:07:55.747845 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.747800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dw5s\" (UniqueName: \"kubernetes.io/projected/75935f8c-8b9f-42fa-b199-a53244c5faff-kube-api-access-6dw5s\") pod \"s3-init-8xspw\" (UID: \"75935f8c-8b9f-42fa-b199-a53244c5faff\") " pod="kserve/s3-init-8xspw" Apr 16 14:07:55.756649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.756619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dw5s\" (UniqueName: \"kubernetes.io/projected/75935f8c-8b9f-42fa-b199-a53244c5faff-kube-api-access-6dw5s\") pod \"s3-init-8xspw\" (UID: \"75935f8c-8b9f-42fa-b199-a53244c5faff\") " pod="kserve/s3-init-8xspw" Apr 16 14:07:55.877532 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:55.877488 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-8xspw" Apr 16 14:07:56.000860 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:56.000832 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-8xspw"] Apr 16 14:07:56.003319 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:07:56.003289 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75935f8c_8b9f_42fa_b199_a53244c5faff.slice/crio-c832b620059d26f065f09f74b2f97c2d98f751b88fa8787c15d1158a72a02268 WatchSource:0}: Error finding container c832b620059d26f065f09f74b2f97c2d98f751b88fa8787c15d1158a72a02268: Status 404 returned error can't find the container with id c832b620059d26f065f09f74b2f97c2d98f751b88fa8787c15d1158a72a02268 Apr 16 14:07:56.005060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:56.005040 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:07:56.277172 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:07:56.277087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8xspw" event={"ID":"75935f8c-8b9f-42fa-b199-a53244c5faff","Type":"ContainerStarted","Data":"c832b620059d26f065f09f74b2f97c2d98f751b88fa8787c15d1158a72a02268"} Apr 16 14:08:01.294408 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:01.294370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8xspw" event={"ID":"75935f8c-8b9f-42fa-b199-a53244c5faff","Type":"ContainerStarted","Data":"78f65c6f1e33bc88167ac9ad7ce1d9a4cc0b835709a075cad91b8491c6d45ebe"} Apr 16 14:08:01.311586 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:01.311496 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-8xspw" podStartSLOduration=1.901091289 podStartE2EDuration="6.31147551s" podCreationTimestamp="2026-04-16 14:07:55 +0000 UTC" firstStartedPulling="2026-04-16 14:07:56.005197818 +0000 UTC m=+650.183446664" lastFinishedPulling="2026-04-16 14:08:00.415582039 +0000 UTC m=+654.593830885" observedRunningTime="2026-04-16 14:08:01.310836225 +0000 UTC m=+655.489085104" watchObservedRunningTime="2026-04-16 14:08:01.31147551 +0000 UTC m=+655.489724397" Apr 16 14:08:04.303026 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:04.302993 2575 generic.go:358] "Generic (PLEG): container finished" podID="75935f8c-8b9f-42fa-b199-a53244c5faff" containerID="78f65c6f1e33bc88167ac9ad7ce1d9a4cc0b835709a075cad91b8491c6d45ebe" exitCode=0 Apr 16 14:08:04.303408 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:04.303075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8xspw" event={"ID":"75935f8c-8b9f-42fa-b199-a53244c5faff","Type":"ContainerDied","Data":"78f65c6f1e33bc88167ac9ad7ce1d9a4cc0b835709a075cad91b8491c6d45ebe"} Apr 16 14:08:05.429322 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:05.429296 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-8xspw" Apr 16 14:08:05.532672 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:05.532636 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dw5s\" (UniqueName: \"kubernetes.io/projected/75935f8c-8b9f-42fa-b199-a53244c5faff-kube-api-access-6dw5s\") pod \"75935f8c-8b9f-42fa-b199-a53244c5faff\" (UID: \"75935f8c-8b9f-42fa-b199-a53244c5faff\") " Apr 16 14:08:05.535144 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:05.535117 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75935f8c-8b9f-42fa-b199-a53244c5faff-kube-api-access-6dw5s" (OuterVolumeSpecName: "kube-api-access-6dw5s") pod "75935f8c-8b9f-42fa-b199-a53244c5faff" (UID: "75935f8c-8b9f-42fa-b199-a53244c5faff"). InnerVolumeSpecName "kube-api-access-6dw5s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:08:05.634268 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:05.634221 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dw5s\" (UniqueName: \"kubernetes.io/projected/75935f8c-8b9f-42fa-b199-a53244c5faff-kube-api-access-6dw5s\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:08:06.312459 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:06.312422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-8xspw" event={"ID":"75935f8c-8b9f-42fa-b199-a53244c5faff","Type":"ContainerDied","Data":"c832b620059d26f065f09f74b2f97c2d98f751b88fa8787c15d1158a72a02268"} Apr 16 14:08:06.312459 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:06.312449 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-8xspw" Apr 16 14:08:06.312459 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:06.312458 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c832b620059d26f065f09f74b2f97c2d98f751b88fa8787c15d1158a72a02268" Apr 16 14:08:15.756437 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.756405 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd"] Apr 16 14:08:15.756904 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.756703 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75935f8c-8b9f-42fa-b199-a53244c5faff" containerName="s3-init" Apr 16 14:08:15.756904 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.756713 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="75935f8c-8b9f-42fa-b199-a53244c5faff" containerName="s3-init" Apr 16 14:08:15.756904 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.756799 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="75935f8c-8b9f-42fa-b199-a53244c5faff" containerName="s3-init" Apr 16 14:08:15.759918 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.759899 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:08:15.762355 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.762332 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9pgb7\"" Apr 16 14:08:15.767350 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.767308 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd"] Apr 16 14:08:15.817338 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.817294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec80429-b3eb-45df-abe8-084948810587-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd\" (UID: \"2ec80429-b3eb-45df-abe8-084948810587\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:08:15.918548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.918507 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec80429-b3eb-45df-abe8-084948810587-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd\" (UID: \"2ec80429-b3eb-45df-abe8-084948810587\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:08:15.918908 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:15.918888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec80429-b3eb-45df-abe8-084948810587-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd\" (UID: \"2ec80429-b3eb-45df-abe8-084948810587\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:08:16.071387 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:16.071298 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:08:16.192950 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:16.192925 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd"] Apr 16 14:08:16.195205 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:08:16.195176 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ec80429_b3eb_45df_abe8_084948810587.slice/crio-850b45268861827fe7d483e380f638eb744c30189b0f71e72f060ac740c5fc85 WatchSource:0}: Error finding container 850b45268861827fe7d483e380f638eb744c30189b0f71e72f060ac740c5fc85: Status 404 returned error can't find the container with id 850b45268861827fe7d483e380f638eb744c30189b0f71e72f060ac740c5fc85 Apr 16 14:08:16.343031 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:16.342950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" event={"ID":"2ec80429-b3eb-45df-abe8-084948810587","Type":"ContainerStarted","Data":"850b45268861827fe7d483e380f638eb744c30189b0f71e72f060ac740c5fc85"} Apr 16 14:08:21.361316 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:21.361280 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" event={"ID":"2ec80429-b3eb-45df-abe8-084948810587","Type":"ContainerStarted","Data":"a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059"} Apr 16 14:08:25.375062 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:25.375025 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ec80429-b3eb-45df-abe8-084948810587" containerID="a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059" exitCode=0 Apr 16 14:08:25.375440 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:25.375087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" event={"ID":"2ec80429-b3eb-45df-abe8-084948810587","Type":"ContainerDied","Data":"a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059"} Apr 16 14:08:39.430491 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:39.430452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" event={"ID":"2ec80429-b3eb-45df-abe8-084948810587","Type":"ContainerStarted","Data":"ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4"} Apr 16 14:08:42.441656 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:42.441619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" event={"ID":"2ec80429-b3eb-45df-abe8-084948810587","Type":"ContainerStarted","Data":"797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be"} Apr 16 14:08:42.442064 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:42.441909 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:08:42.443409 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:42.443361 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:08:42.464001 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:42.463949 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podStartSLOduration=2.052451235 podStartE2EDuration="27.463933712s" podCreationTimestamp="2026-04-16 14:08:15 +0000 UTC" firstStartedPulling="2026-04-16 14:08:16.197085946 +0000 UTC m=+670.375334792" lastFinishedPulling="2026-04-16 14:08:41.608568418 +0000 UTC m=+695.786817269" observedRunningTime="2026-04-16 14:08:42.462119012 +0000 UTC m=+696.640367881" watchObservedRunningTime="2026-04-16 14:08:42.463933712 +0000 UTC m=+696.642182580" Apr 16 14:08:43.444785 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:43.444754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:08:43.445229 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:43.444861 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:08:43.445690 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:43.445655 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:08:44.448116 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:44.448069 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:08:44.448560 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:44.448534 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:08:54.449188 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:54.449138 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:08:54.449679 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:08:54.449494 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:04.448962 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:04.448846 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:09:04.449360 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:04.449337 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:14.449052 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:14.449007 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:09:14.449536 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:14.449512 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:24.449070 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:24.449025 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:09:24.449542 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:24.449499 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:34.448582 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:34.448532 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:09:34.449145 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:34.449014 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:44.448960 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:44.448923 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:09:44.449334 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:44.449049 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:09:50.954371 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:50.954342 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd"] Apr 16 14:09:50.954783 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:50.954635 2575 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" containerID="cri-o://ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4" gracePeriod=30 Apr 16 14:09:50.954783 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:50.954741 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" containerID="cri-o://797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be" gracePeriod=30 Apr 16 14:09:51.055369 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.055333 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q"] Apr 16 14:09:51.059558 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.059524 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:09:51.080484 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.080450 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q"] Apr 16 14:09:51.128886 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.128851 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bc59a57-592a-427d-b4fb-53fad0ca5032-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q\" (UID: \"4bc59a57-592a-427d-b4fb-53fad0ca5032\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:09:51.157297 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.157253 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v"] Apr 16 14:09:51.159609 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.159578 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:09:51.181919 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.181883 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v"] Apr 16 14:09:51.229629 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.229540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88fe0cd5-1e65-4c38-b7b0-a35284232513-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v\" (UID: \"88fe0cd5-1e65-4c38-b7b0-a35284232513\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:09:51.229629 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.229592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bc59a57-592a-427d-b4fb-53fad0ca5032-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q\" (UID: \"4bc59a57-592a-427d-b4fb-53fad0ca5032\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:09:51.230024 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.230001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bc59a57-592a-427d-b4fb-53fad0ca5032-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q\" (UID: \"4bc59a57-592a-427d-b4fb-53fad0ca5032\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:09:51.330488 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.330439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88fe0cd5-1e65-4c38-b7b0-a35284232513-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v\" (UID: \"88fe0cd5-1e65-4c38-b7b0-a35284232513\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:09:51.330844 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.330822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88fe0cd5-1e65-4c38-b7b0-a35284232513-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v\" (UID: \"88fe0cd5-1e65-4c38-b7b0-a35284232513\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:09:51.371520 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.371478 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:09:51.470764 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.470715 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:09:51.498802 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.498582 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q"] Apr 16 14:09:51.502007 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:09:51.501962 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc59a57_592a_427d_b4fb_53fad0ca5032.slice/crio-13e89ca403af49596fb10383131b179d3ec5d9160c7975bdd501bd1afd87292a WatchSource:0}: Error finding container 13e89ca403af49596fb10383131b179d3ec5d9160c7975bdd501bd1afd87292a: Status 404 returned error can't find the container with id 13e89ca403af49596fb10383131b179d3ec5d9160c7975bdd501bd1afd87292a Apr 16 14:09:51.613258 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.613214 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v"] Apr 16 14:09:51.617195 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:09:51.617167 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88fe0cd5_1e65_4c38_b7b0_a35284232513.slice/crio-6f214e75e36dfd5b5845f6135cdf483a7f62e16fa1263f3b808dad8a867aced6 WatchSource:0}: Error finding container 6f214e75e36dfd5b5845f6135cdf483a7f62e16fa1263f3b808dad8a867aced6: Status 404 returned error can't find the container with id 6f214e75e36dfd5b5845f6135cdf483a7f62e16fa1263f3b808dad8a867aced6 Apr 16 14:09:51.644509 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.644477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" event={"ID":"88fe0cd5-1e65-4c38-b7b0-a35284232513","Type":"ContainerStarted","Data":"6f214e75e36dfd5b5845f6135cdf483a7f62e16fa1263f3b808dad8a867aced6"} Apr 16 14:09:51.645836 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.645805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" event={"ID":"4bc59a57-592a-427d-b4fb-53fad0ca5032","Type":"ContainerStarted","Data":"698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087"} Apr 16 14:09:51.645836 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:51.645839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" event={"ID":"4bc59a57-592a-427d-b4fb-53fad0ca5032","Type":"ContainerStarted","Data":"13e89ca403af49596fb10383131b179d3ec5d9160c7975bdd501bd1afd87292a"} Apr 16 14:09:52.650783 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:52.650748 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" event={"ID":"88fe0cd5-1e65-4c38-b7b0-a35284232513","Type":"ContainerStarted","Data":"892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806"} Apr 16 14:09:54.449138 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:54.449090 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:09:54.449556 
ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:54.449431 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:09:55.664011 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:55.663915 2575 generic.go:358] "Generic (PLEG): container finished" podID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerID="892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806" exitCode=0 Apr 16 14:09:55.664011 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:55.663987 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" event={"ID":"88fe0cd5-1e65-4c38-b7b0-a35284232513","Type":"ContainerDied","Data":"892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806"} Apr 16 14:09:55.665897 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:55.665813 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ec80429-b3eb-45df-abe8-084948810587" containerID="ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4" exitCode=0 Apr 16 14:09:55.665897 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:55.665888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" event={"ID":"2ec80429-b3eb-45df-abe8-084948810587","Type":"ContainerDied","Data":"ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4"} Apr 16 14:09:55.667155 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:55.667132 2575 generic.go:358] "Generic (PLEG): container finished" podID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerID="698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087" exitCode=0 Apr 16 14:09:55.667239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:55.667170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" event={"ID":"4bc59a57-592a-427d-b4fb-53fad0ca5032","Type":"ContainerDied","Data":"698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087"} Apr 16 14:09:56.673761 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:56.673693 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" event={"ID":"4bc59a57-592a-427d-b4fb-53fad0ca5032","Type":"ContainerStarted","Data":"1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d"} Apr 16 14:09:56.674391 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:56.674184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:09:56.675849 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:56.675813 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:09:57.677577 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:09:57.677540 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:10:04.449060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:04.449007 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:10:04.449540 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:04.449429 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:10:07.678110 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:07.678050 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:10:14.449108 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:14.449054 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 14:10:14.449565 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:14.449218 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:10:14.449565 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:14.449441 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:10:14.449682 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:14.449568 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:10:14.467273 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:14.467222 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podStartSLOduration=23.467208385 podStartE2EDuration="23.467208385s" podCreationTimestamp="2026-04-16 14:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:09:56.690928491 +0000 UTC m=+770.869177361" watchObservedRunningTime="2026-04-16 14:10:14.467208385 +0000 UTC m=+788.645457252" Apr 16 14:10:16.743319 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:16.743285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" event={"ID":"88fe0cd5-1e65-4c38-b7b0-a35284232513","Type":"ContainerStarted","Data":"9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c"} Apr 16 14:10:16.743792 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:16.743575 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:10:16.745002 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:16.744976 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 14:10:16.760208 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:16.760150 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podStartSLOduration=5.088550647 podStartE2EDuration="25.760134795s" podCreationTimestamp="2026-04-16 14:09:51 +0000 UTC" firstStartedPulling="2026-04-16 14:09:55.665335356 +0000 UTC m=+769.843584201" lastFinishedPulling="2026-04-16 14:10:16.336919486 +0000 UTC m=+790.515168349" observedRunningTime="2026-04-16 14:10:16.758196492 +0000 UTC m=+790.936445360" watchObservedRunningTime="2026-04-16 14:10:16.760134795 +0000 UTC m=+790.938383660" Apr 16 14:10:17.677776 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:17.677710 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:10:17.746477 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:17.746440 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 14:10:21.092742 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.092707 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:10:21.193475 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.193435 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec80429-b3eb-45df-abe8-084948810587-kserve-provision-location\") pod \"2ec80429-b3eb-45df-abe8-084948810587\" (UID: \"2ec80429-b3eb-45df-abe8-084948810587\") " Apr 16 14:10:21.193794 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.193770 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ec80429-b3eb-45df-abe8-084948810587-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ec80429-b3eb-45df-abe8-084948810587" (UID: "2ec80429-b3eb-45df-abe8-084948810587"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:10:21.294555 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.294461 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec80429-b3eb-45df-abe8-084948810587-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:10:21.760200 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.760167 2575 generic.go:358] "Generic (PLEG): container finished" podID="2ec80429-b3eb-45df-abe8-084948810587" containerID="797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be" exitCode=0 Apr 16 14:10:21.760370 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.760260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" event={"ID":"2ec80429-b3eb-45df-abe8-084948810587","Type":"ContainerDied","Data":"797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be"} Apr 16 14:10:21.760370 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.760296 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" Apr 16 14:10:21.760370 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.760313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd" event={"ID":"2ec80429-b3eb-45df-abe8-084948810587","Type":"ContainerDied","Data":"850b45268861827fe7d483e380f638eb744c30189b0f71e72f060ac740c5fc85"} Apr 16 14:10:21.760370 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.760336 2575 scope.go:117] "RemoveContainer" containerID="797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be" Apr 16 14:10:21.768087 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.768070 2575 scope.go:117] "RemoveContainer" containerID="ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4" Apr 16 14:10:21.774937 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.774922 2575 scope.go:117] "RemoveContainer" containerID="a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059" Apr 16 14:10:21.782164 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.782143 2575 scope.go:117] "RemoveContainer" containerID="797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be" Apr 16 14:10:21.782423 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:10:21.782405 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be\": container with ID starting with 797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be not found: ID does not exist" containerID="797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be" Apr 16 14:10:21.782495 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.782432 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be"} err="failed to get container status \"797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be\": rpc error: code = NotFound desc = could not find container \"797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be\": container with ID starting with 797b91c4c6ab58ecb090af30a06676b3fa8faa42d31e513ccd2d490ff86129be not found: ID does not exist" Apr 16 
14:10:21.782495 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.782452 2575 scope.go:117] "RemoveContainer" containerID="ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4" Apr 16 14:10:21.782635 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.782616 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd"] Apr 16 14:10:21.782709 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:10:21.782684 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4\": container with ID starting with ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4 not found: ID does not exist" containerID="ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4" Apr 16 14:10:21.782795 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.782747 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4"} err="failed to get container status \"ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4\": rpc error: code = NotFound desc = could not find container \"ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4\": container with ID starting with ac746b312deaf5ccd6e05977efb9ed5ffc511b09e910914278658d83a11477b4 not found: ID does not exist" Apr 16 14:10:21.782795 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.782774 2575 scope.go:117] "RemoveContainer" containerID="a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059" Apr 16 14:10:21.783010 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:10:21.782991 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059\": container with ID starting with a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059 not found: ID does not exist" containerID="a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059" Apr 16 14:10:21.783060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.783016 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059"} err="failed to get container status \"a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059\": rpc error: code = NotFound desc = could not find container \"a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059\": container with ID starting with a838b5cb56df6db377b4817f7225fc275a5bf6a0d9105dc57d8402559f974059 not found: ID does not exist" Apr 16 14:10:21.785968 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:21.785941 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-6af14-predictor-848cbf7b6b-2kdkd"] Apr 16 14:10:22.390095 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:22.390055 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec80429-b3eb-45df-abe8-084948810587" path="/var/lib/kubelet/pods/2ec80429-b3eb-45df-abe8-084948810587/volumes" Apr 16 14:10:27.678043 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:27.677949 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" 
podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:10:27.747084 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:27.747040 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 14:10:37.677852 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:37.677807 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:10:37.746666 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:37.746626 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 14:10:47.678353 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:47.678307 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:10:47.747200 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:47.747156 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 14:10:57.677698 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:57.677650 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:10:57.746956 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:10:57.746915 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 14:11:04.386817 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:04.386769 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:11:07.746894 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:07.746844 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 14:11:14.389642 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:14.389612 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:11:17.747550 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:17.747520 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:11:41.316334 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.316298 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q"] Apr 16 14:11:41.316862 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.316575 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" containerID="cri-o://1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d" gracePeriod=30 Apr 16 14:11:41.358048 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358015 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5"] Apr 16 14:11:41.358332 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358321 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="storage-initializer" Apr 16 14:11:41.358378 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358334 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="storage-initializer" Apr 16 14:11:41.358378 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358354 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" Apr 16 14:11:41.358378 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358360 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" Apr 16 14:11:41.358378 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358370 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" Apr 16 14:11:41.358378 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358377 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" Apr 16 14:11:41.358529 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358427 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="agent" Apr 16 14:11:41.358529 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.358435 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ec80429-b3eb-45df-abe8-084948810587" containerName="kserve-container" Apr 16 14:11:41.361523 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.361505 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:11:41.424102 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.424071 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5"] Apr 16 14:11:41.453850 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.453817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78c543cb-924e-400d-a144-85d5eedf964f-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5\" (UID: \"78c543cb-924e-400d-a144-85d5eedf964f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:11:41.505719 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.505677 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw"] Apr 16 14:11:41.508994 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.508973 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:11:41.518586 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.518558 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw"] Apr 16 14:11:41.554367 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.554326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78c543cb-924e-400d-a144-85d5eedf964f-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5\" (UID: \"78c543cb-924e-400d-a144-85d5eedf964f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:11:41.554812 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.554773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78c543cb-924e-400d-a144-85d5eedf964f-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5\" (UID: \"78c543cb-924e-400d-a144-85d5eedf964f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:11:41.615684 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.615646 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v"] Apr 16 14:11:41.615954 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.615914 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" containerID="cri-o://9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c" gracePeriod=30 Apr 16 14:11:41.655364 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.655318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b-kserve-provision-location\") pod 
\"isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw\" (UID: \"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:11:41.673312 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.673278 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:11:41.756764 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.756694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw\" (UID: \"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:11:41.757201 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.757173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw\" (UID: \"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:11:41.798807 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.798486 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5"] Apr 16 14:11:41.801432 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:11:41.801402 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78c543cb_924e_400d_a144_85d5eedf964f.slice/crio-659184bcb5b16e3590cc7bf18c40e9baf4efd1cc48e2d04f8c320fabcbaee49a WatchSource:0}: Error finding container 659184bcb5b16e3590cc7bf18c40e9baf4efd1cc48e2d04f8c320fabcbaee49a: Status 404 returned error can't find the container with id 659184bcb5b16e3590cc7bf18c40e9baf4efd1cc48e2d04f8c320fabcbaee49a Apr 16 14:11:41.820040 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.820010 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:11:41.955598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:41.955563 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw"] Apr 16 14:11:42.006809 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:42.006768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" event={"ID":"78c543cb-924e-400d-a144-85d5eedf964f","Type":"ContainerStarted","Data":"109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca"} Apr 16 14:11:42.006957 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:42.006820 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" event={"ID":"78c543cb-924e-400d-a144-85d5eedf964f","Type":"ContainerStarted","Data":"659184bcb5b16e3590cc7bf18c40e9baf4efd1cc48e2d04f8c320fabcbaee49a"} Apr 16 14:11:42.008451 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:42.008427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" event={"ID":"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b","Type":"ContainerStarted","Data":"278c0b4b46fb5b419e2bf779fd7baa98870887d4b38144d841baf2a4a3793cd7"} Apr 16 14:11:43.014586 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:43.014540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" event={"ID":"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b","Type":"ContainerStarted","Data":"e605ee59964c227c21f2e05033c2c46defa1a6bc5a55162c5443f90f44093ec5"} Apr 16 14:11:44.386692 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:44.386642 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 14:11:45.551305 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:45.551281 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:11:45.589523 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:45.589488 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88fe0cd5-1e65-4c38-b7b0-a35284232513-kserve-provision-location\") pod \"88fe0cd5-1e65-4c38-b7b0-a35284232513\" (UID: \"88fe0cd5-1e65-4c38-b7b0-a35284232513\") " Apr 16 14:11:45.589844 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:45.589818 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88fe0cd5-1e65-4c38-b7b0-a35284232513-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88fe0cd5-1e65-4c38-b7b0-a35284232513" (UID: "88fe0cd5-1e65-4c38-b7b0-a35284232513"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:11:45.690533 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:45.690498 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88fe0cd5-1e65-4c38-b7b0-a35284232513-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:11:45.940356 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:45.940331 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:11:45.993560 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:45.993526 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bc59a57-592a-427d-b4fb-53fad0ca5032-kserve-provision-location\") pod \"4bc59a57-592a-427d-b4fb-53fad0ca5032\" (UID: \"4bc59a57-592a-427d-b4fb-53fad0ca5032\") " Apr 16 14:11:45.993911 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:45.993889 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc59a57-592a-427d-b4fb-53fad0ca5032-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4bc59a57-592a-427d-b4fb-53fad0ca5032" (UID: "4bc59a57-592a-427d-b4fb-53fad0ca5032"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:11:46.026500 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.026465 2575 generic.go:358] "Generic (PLEG): container finished" podID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerID="1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d" exitCode=0 Apr 16 14:11:46.026677 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.026535 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" Apr 16 14:11:46.026677 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.026551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" event={"ID":"4bc59a57-592a-427d-b4fb-53fad0ca5032","Type":"ContainerDied","Data":"1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d"} Apr 16 14:11:46.026677 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.026597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q" event={"ID":"4bc59a57-592a-427d-b4fb-53fad0ca5032","Type":"ContainerDied","Data":"13e89ca403af49596fb10383131b179d3ec5d9160c7975bdd501bd1afd87292a"} Apr 16 14:11:46.026677 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.026618 2575 scope.go:117] "RemoveContainer" containerID="1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d" Apr 16 14:11:46.027980 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.027959 2575 generic.go:358] "Generic (PLEG): container finished" podID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerID="e605ee59964c227c21f2e05033c2c46defa1a6bc5a55162c5443f90f44093ec5" exitCode=0 Apr 16 14:11:46.028096 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.028032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" event={"ID":"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b","Type":"ContainerDied","Data":"e605ee59964c227c21f2e05033c2c46defa1a6bc5a55162c5443f90f44093ec5"} Apr 16 14:11:46.029548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.029521 2575 generic.go:358] "Generic (PLEG): container finished" podID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerID="9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c" exitCode=0 Apr 16 14:11:46.029641 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.029598 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" Apr 16 14:11:46.029641 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.029609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" event={"ID":"88fe0cd5-1e65-4c38-b7b0-a35284232513","Type":"ContainerDied","Data":"9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c"} Apr 16 14:11:46.029776 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.029640 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v" event={"ID":"88fe0cd5-1e65-4c38-b7b0-a35284232513","Type":"ContainerDied","Data":"6f214e75e36dfd5b5845f6135cdf483a7f62e16fa1263f3b808dad8a867aced6"} Apr 16 14:11:46.031196 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.031176 2575 generic.go:358] "Generic (PLEG): container finished" podID="78c543cb-924e-400d-a144-85d5eedf964f" containerID="109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca" exitCode=0 Apr 16 14:11:46.031285 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.031212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" event={"ID":"78c543cb-924e-400d-a144-85d5eedf964f","Type":"ContainerDied","Data":"109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca"} Apr 16 14:11:46.035363 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.035344 2575 scope.go:117] "RemoveContainer" containerID="698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087" Apr 16 14:11:46.043357 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.043342 2575 scope.go:117] "RemoveContainer" containerID="1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d" Apr 16 14:11:46.043617 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:11:46.043595 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d\": container with ID starting with 1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d not found: ID does not exist" containerID="1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d" Apr 16 14:11:46.043670 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.043631 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d"} err="failed to get container status \"1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d\": rpc error: code = NotFound desc = could not find container \"1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d\": container with ID starting with 1d600f872c34370998538125ed2451c95745d2f8931c9af751844336e975597d not found: ID does not exist" Apr 16 14:11:46.043670 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.043656 2575 scope.go:117] "RemoveContainer" containerID="698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087" Apr 16 14:11:46.043938 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:11:46.043919 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087\": container with ID starting with 698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087 not 
found: ID does not exist" containerID="698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087" Apr 16 14:11:46.043998 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.043943 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087"} err="failed to get container status \"698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087\": rpc error: code = NotFound desc = could not find container \"698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087\": container with ID starting with 698fa5e1c30bdcfba04ef349a4000b9a50a2d735df629a3911497f030b804087 not found: ID does not exist" Apr 16 14:11:46.043998 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.043959 2575 scope.go:117] "RemoveContainer" containerID="9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c" Apr 16 14:11:46.052358 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.052336 2575 scope.go:117] "RemoveContainer" containerID="892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806" Apr 16 14:11:46.061683 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.061636 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q"] Apr 16 14:11:46.063759 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.063740 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bd7db-predictor-fd55b8664-sg94q"] Apr 16 14:11:46.069915 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.069897 2575 scope.go:117] "RemoveContainer" containerID="9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c" Apr 16 14:11:46.070196 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:11:46.070176 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c\": container with ID starting with 9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c not found: ID does not exist" containerID="9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c" Apr 16 14:11:46.070254 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.070203 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c"} err="failed to get container status \"9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c\": rpc error: code = NotFound desc = could not find container \"9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c\": container with ID starting with 9a4288c5babf548461f884c8126bc7132437e2d3284c35d7a233c4d3a2621e1c not found: ID does not exist" Apr 16 14:11:46.070254 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.070222 2575 scope.go:117] "RemoveContainer" containerID="892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806" Apr 16 14:11:46.070482 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:11:46.070460 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806\": container with ID starting with 892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806 not found: ID does not exist" containerID="892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806" Apr 16 14:11:46.070530 
ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.070493 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806"} err="failed to get container status \"892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806\": rpc error: code = NotFound desc = could not find container \"892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806\": container with ID starting with 892d76593b1ca585fca2bdf4d0eca50dddb4093af70a0b03b10a09ab04c8a806 not found: ID does not exist" Apr 16 14:11:46.074498 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.074474 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v"] Apr 16 14:11:46.075985 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.075962 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bd7db-predictor-556ff547f5-4ms9v"] Apr 16 14:11:46.094125 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.094102 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bc59a57-592a-427d-b4fb-53fad0ca5032-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:11:46.395769 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.395717 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" path="/var/lib/kubelet/pods/4bc59a57-592a-427d-b4fb-53fad0ca5032/volumes" Apr 16 14:11:46.396124 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:46.396110 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" path="/var/lib/kubelet/pods/88fe0cd5-1e65-4c38-b7b0-a35284232513/volumes" Apr 16 14:11:47.037188 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:47.037155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" event={"ID":"78c543cb-924e-400d-a144-85d5eedf964f","Type":"ContainerStarted","Data":"f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56"} Apr 16 14:11:47.037758 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:47.037482 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:11:47.039060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:47.039017 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 14:11:47.039709 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:47.039688 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" event={"ID":"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b","Type":"ContainerStarted","Data":"eaa2bf15188c0012fa8e410b1da73731f73c06f808bbaf75d7c95077413db677"} Apr 16 14:11:47.040024 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:47.040001 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:11:47.041000 
ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:47.040978 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 14:11:47.063303 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:47.063241 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podStartSLOduration=6.063227543 podStartE2EDuration="6.063227543s" podCreationTimestamp="2026-04-16 14:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:11:47.058755387 +0000 UTC m=+881.237004252" watchObservedRunningTime="2026-04-16 14:11:47.063227543 +0000 UTC m=+881.241476411" Apr 16 14:11:47.081532 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:47.081473 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podStartSLOduration=6.081458986 podStartE2EDuration="6.081458986s" podCreationTimestamp="2026-04-16 14:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:11:47.079427216 +0000 UTC m=+881.257676084" watchObservedRunningTime="2026-04-16 14:11:47.081458986 +0000 UTC m=+881.259707924" Apr 16 14:11:48.044337 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:48.044291 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 14:11:48.044701 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:48.044297 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 14:11:58.044888 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:58.044793 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 14:11:58.045276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:11:58.044793 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 14:12:06.323410 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:06.323376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:12:06.325747 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:06.325706 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:12:08.044592 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:08.044550 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 14:12:08.044996 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:08.044550 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 14:12:18.045243 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:18.045194 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 14:12:18.045856 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:18.045194 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 14:12:28.044456 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:28.044414 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 14:12:28.044871 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:28.044416 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 14:12:38.045271 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:38.045220 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 14:12:38.045663 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:38.045220 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 14:12:48.045330 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:48.045284 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 14:12:48.045891 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:48.045870 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:12:58.045140 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:12:58.045107 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:13:21.608197 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:21.608161 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5"] Apr 16 14:13:21.608598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:21.608505 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" containerID="cri-o://f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56" gracePeriod=30 Apr 16 14:13:21.709587 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:21.709555 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw"] Apr 16 14:13:21.709875 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:21.709851 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" containerID="cri-o://eaa2bf15188c0012fa8e410b1da73731f73c06f808bbaf75d7c95077413db677" gracePeriod=30 Apr 16 14:13:25.340165 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:25.340132 2575 generic.go:358] "Generic (PLEG): container finished" podID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerID="eaa2bf15188c0012fa8e410b1da73731f73c06f808bbaf75d7c95077413db677" exitCode=0 Apr 16 14:13:25.340477 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:25.340200 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" event={"ID":"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b","Type":"ContainerDied","Data":"eaa2bf15188c0012fa8e410b1da73731f73c06f808bbaf75d7c95077413db677"} Apr 16 14:13:25.340477 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:25.340245 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" event={"ID":"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b","Type":"ContainerDied","Data":"278c0b4b46fb5b419e2bf779fd7baa98870887d4b38144d841baf2a4a3793cd7"} Apr 16 14:13:25.340477 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:25.340256 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="278c0b4b46fb5b419e2bf779fd7baa98870887d4b38144d841baf2a4a3793cd7" Apr 16 14:13:25.350118 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:25.350096 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:13:25.428398 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:25.428324 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b-kserve-provision-location\") pod \"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b\" (UID: \"559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b\") " Apr 16 14:13:25.428645 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:25.428621 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" (UID: "559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:25.529194 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:25.529155 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:13:26.044293 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.044267 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:13:26.133649 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.133553 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78c543cb-924e-400d-a144-85d5eedf964f-kserve-provision-location\") pod \"78c543cb-924e-400d-a144-85d5eedf964f\" (UID: \"78c543cb-924e-400d-a144-85d5eedf964f\") " Apr 16 14:13:26.133921 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.133898 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c543cb-924e-400d-a144-85d5eedf964f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "78c543cb-924e-400d-a144-85d5eedf964f" (UID: "78c543cb-924e-400d-a144-85d5eedf964f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:13:26.234396 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.234360 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78c543cb-924e-400d-a144-85d5eedf964f-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:13:26.344110 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.344070 2575 generic.go:358] "Generic (PLEG): container finished" podID="78c543cb-924e-400d-a144-85d5eedf964f" containerID="f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56" exitCode=0 Apr 16 14:13:26.344538 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.344184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" event={"ID":"78c543cb-924e-400d-a144-85d5eedf964f","Type":"ContainerDied","Data":"f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56"} Apr 16 14:13:26.344538 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.344226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" event={"ID":"78c543cb-924e-400d-a144-85d5eedf964f","Type":"ContainerDied","Data":"659184bcb5b16e3590cc7bf18c40e9baf4efd1cc48e2d04f8c320fabcbaee49a"} Apr 16 14:13:26.344538 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.344228 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5" Apr 16 14:13:26.344538 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.344238 2575 scope.go:117] "RemoveContainer" containerID="f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56" Apr 16 14:13:26.344538 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.344227 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw" Apr 16 14:13:26.354438 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.354231 2575 scope.go:117] "RemoveContainer" containerID="109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca" Apr 16 14:13:26.363200 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.363182 2575 scope.go:117] "RemoveContainer" containerID="f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56" Apr 16 14:13:26.363484 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:13:26.363465 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56\": container with ID starting with f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56 not found: ID does not exist" containerID="f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56" Apr 16 14:13:26.363531 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.363494 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56"} err="failed to get container status \"f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56\": rpc error: code = NotFound desc = could not find container \"f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56\": container with ID starting with f09802147c3a1f381f91a7c1f37798f704dde8ebdf139d93fbb769b57e50aa56 not found: ID does not exist" Apr 16 14:13:26.363531 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.363518 2575 scope.go:117] "RemoveContainer" containerID="109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca" Apr 16 14:13:26.363789 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:13:26.363765 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca\": container with ID starting with 109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca not found: ID does not exist" containerID="109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca" Apr 16 14:13:26.363883 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.363795 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca"} err="failed to get container status \"109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca\": rpc error: code = NotFound desc = could not find container \"109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca\": container with ID starting with 109c67963f587ba335debf8d9586f4899e39cb02b9e10090f70046768788bbca not found: ID does not exist" Apr 16 14:13:26.369134 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.369111 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw"] Apr 16 14:13:26.373223 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.373197 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-1d91c-predictor-779d987f87-tntcw"] Apr 16 14:13:26.383093 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.383069 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5"] Apr 16 14:13:26.394811 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.394714 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" path="/var/lib/kubelet/pods/559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b/volumes" Apr 16 14:13:26.395093 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:26.395074 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-1d91c-predictor-76b8b86b6d-vpcm5"] Apr 16 14:13:28.389859 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:28.389814 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c543cb-924e-400d-a144-85d5eedf964f" path="/var/lib/kubelet/pods/78c543cb-924e-400d-a144-85d5eedf964f/volumes" Apr 16 14:13:31.656505 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656467 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg"] Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656800 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656813 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656828 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="storage-initializer" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656833 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="storage-initializer" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656841 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656848 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656855 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="storage-initializer" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656861 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="storage-initializer" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656872 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="storage-initializer" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656876 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="storage-initializer" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656887 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" 
containerName="kserve-container" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656892 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" Apr 16 14:13:31.656892 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656898 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="storage-initializer" Apr 16 14:13:31.657262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656903 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="storage-initializer" Apr 16 14:13:31.657262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656909 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" Apr 16 14:13:31.657262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656915 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" Apr 16 14:13:31.657262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656960 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bc59a57-592a-427d-b4fb-53fad0ca5032" containerName="kserve-container" Apr 16 14:13:31.657262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656969 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="78c543cb-924e-400d-a144-85d5eedf964f" containerName="kserve-container" Apr 16 14:13:31.657262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656975 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="559b88c7-e9d3-4da4-bb5d-fefe2d5a8e9b" containerName="kserve-container" Apr 16 14:13:31.657262 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.656983 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="88fe0cd5-1e65-4c38-b7b0-a35284232513" containerName="kserve-container" Apr 16 14:13:31.659860 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.659845 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:13:31.662077 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.662057 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9pgb7\"" Apr 16 14:13:31.669330 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.669303 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg"] Apr 16 14:13:31.674939 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.674912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/710cab95-3f1a-48b7-b29b-1c6b14b8887e-kserve-provision-location\") pod \"isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg\" (UID: \"710cab95-3f1a-48b7-b29b-1c6b14b8887e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:13:31.775955 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.775918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/710cab95-3f1a-48b7-b29b-1c6b14b8887e-kserve-provision-location\") pod \"isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg\" (UID: \"710cab95-3f1a-48b7-b29b-1c6b14b8887e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:13:31.776276 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.776255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/710cab95-3f1a-48b7-b29b-1c6b14b8887e-kserve-provision-location\") pod \"isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg\" (UID: \"710cab95-3f1a-48b7-b29b-1c6b14b8887e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:13:31.970173 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:31.970075 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:13:32.091586 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:32.091556 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg"] Apr 16 14:13:32.094299 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:13:32.094269 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod710cab95_3f1a_48b7_b29b_1c6b14b8887e.slice/crio-0ed937f0c3cadb403d1bc569934124203c44d7409d495627b096916c6c42494a WatchSource:0}: Error finding container 0ed937f0c3cadb403d1bc569934124203c44d7409d495627b096916c6c42494a: Status 404 returned error can't find the container with id 0ed937f0c3cadb403d1bc569934124203c44d7409d495627b096916c6c42494a Apr 16 14:13:32.096484 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:32.096467 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:13:32.370036 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:32.370002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" event={"ID":"710cab95-3f1a-48b7-b29b-1c6b14b8887e","Type":"ContainerStarted","Data":"0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040"} Apr 16 14:13:32.370036 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:32.370039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" event={"ID":"710cab95-3f1a-48b7-b29b-1c6b14b8887e","Type":"ContainerStarted","Data":"0ed937f0c3cadb403d1bc569934124203c44d7409d495627b096916c6c42494a"} Apr 16 14:13:36.383520 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:36.383487 2575 generic.go:358] "Generic (PLEG): container finished" podID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerID="0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040" exitCode=0 Apr 16 14:13:36.383923 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:36.383558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" event={"ID":"710cab95-3f1a-48b7-b29b-1c6b14b8887e","Type":"ContainerDied","Data":"0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040"} Apr 16 14:13:37.388825 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:37.388789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" event={"ID":"710cab95-3f1a-48b7-b29b-1c6b14b8887e","Type":"ContainerStarted","Data":"6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479"} Apr 16 14:13:37.388825 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:37.388832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" event={"ID":"710cab95-3f1a-48b7-b29b-1c6b14b8887e","Type":"ContainerStarted","Data":"06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f"} Apr 16 14:13:37.389241 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:37.389110 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:13:37.390426 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:37.390402 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:13:37.405422 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:37.405364 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podStartSLOduration=6.405346426 podStartE2EDuration="6.405346426s" podCreationTimestamp="2026-04-16 14:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:13:37.404701106 +0000 UTC m=+991.582949973" watchObservedRunningTime="2026-04-16 14:13:37.405346426 +0000 UTC m=+991.583595296" Apr 16 14:13:38.394526 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:38.394492 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:13:38.394928 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:38.394598 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:13:38.395525 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:38.395503 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:39.397697 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:39.397650 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:13:39.398165 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:39.398088 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:49.397886 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:49.397835 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:13:49.398359 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:49.398335 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:59.397993 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:59.397932 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:13:59.398415 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:13:59.398341 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:14:09.398678 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:09.398630 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:14:09.399129 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:09.399088 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:14:19.398275 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:19.398217 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:14:19.398782 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:19.398681 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:14:29.397934 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:29.397875 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:14:29.398447 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:29.398264 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:14:39.398033 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:39.397984 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:14:39.398512 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:39.398376 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:14:49.398988 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:49.398939 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:14:49.399419 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:49.399010 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:14:56.880389 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:56.880305 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg"] Apr 16 14:14:56.880893 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:56.880695 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" containerID="cri-o://06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f" gracePeriod=30 Apr 16 14:14:56.880893 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:56.880848 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" containerID="cri-o://6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479" gracePeriod=30 Apr 16 14:14:56.912198 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:56.912163 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv"] Apr 16 14:14:56.915622 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:56.915606 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:14:56.922541 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:56.922357 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv"] Apr 16 14:14:56.965683 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:56.965648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc9c1d1d-5ded-4e1a-b096-56ce46b255e1-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv\" (UID: \"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:14:57.067076 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:57.067027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc9c1d1d-5ded-4e1a-b096-56ce46b255e1-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv\" (UID: \"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:14:57.067415 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:57.067394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc9c1d1d-5ded-4e1a-b096-56ce46b255e1-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv\" (UID: \"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:14:57.227110 ip-10-0-141-131 kubenswrapper[2575]: 
I0416 14:14:57.227024 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:14:57.343563 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:57.343539 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv"] Apr 16 14:14:57.346111 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:14:57.346084 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9c1d1d_5ded_4e1a_b096_56ce46b255e1.slice/crio-1e48887b706dec5a8bfc74b6c630e720959d5f49fd40b7069f9fe6379ce676c8 WatchSource:0}: Error finding container 1e48887b706dec5a8bfc74b6c630e720959d5f49fd40b7069f9fe6379ce676c8: Status 404 returned error can't find the container with id 1e48887b706dec5a8bfc74b6c630e720959d5f49fd40b7069f9fe6379ce676c8 Apr 16 14:14:57.624098 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:57.624066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" event={"ID":"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1","Type":"ContainerStarted","Data":"82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61"} Apr 16 14:14:57.624098 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:57.624105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" event={"ID":"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1","Type":"ContainerStarted","Data":"1e48887b706dec5a8bfc74b6c630e720959d5f49fd40b7069f9fe6379ce676c8"} Apr 16 14:14:59.398380 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:59.398316 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:14:59.398883 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:14:59.398602 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:15:01.641928 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:01.641893 2575 generic.go:358] "Generic (PLEG): container finished" podID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerID="06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f" exitCode=0 Apr 16 14:15:01.642357 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:01.641960 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" event={"ID":"710cab95-3f1a-48b7-b29b-1c6b14b8887e","Type":"ContainerDied","Data":"06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f"} Apr 16 14:15:01.643270 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:01.643249 2575 generic.go:358] "Generic (PLEG): container finished" podID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerID="82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61" exitCode=0 Apr 16 14:15:01.643376 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:01.643307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" 
event={"ID":"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1","Type":"ContainerDied","Data":"82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61"} Apr 16 14:15:02.647825 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:02.647788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" event={"ID":"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1","Type":"ContainerStarted","Data":"1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d"} Apr 16 14:15:02.648441 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:02.648127 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:15:02.649606 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:02.649581 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:15:02.665281 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:02.665222 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podStartSLOduration=6.665204615 podStartE2EDuration="6.665204615s" podCreationTimestamp="2026-04-16 14:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:15:02.664262962 +0000 UTC m=+1076.842511831" watchObservedRunningTime="2026-04-16 14:15:02.665204615 +0000 UTC m=+1076.843453481" Apr 16 14:15:03.650887 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:03.650847 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:15:09.398678 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:09.398626 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:15:09.399170 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:09.398945 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:15:13.651159 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:13.651109 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:15:19.398707 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:19.398641 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 14:15:19.399260 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:19.398828 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:15:19.399260 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:19.399043 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:15:19.399260 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:19.399158 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:15:23.651674 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:23.651619 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:15:27.524697 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.524675 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:15:27.628198 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.628154 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/710cab95-3f1a-48b7-b29b-1c6b14b8887e-kserve-provision-location\") pod \"710cab95-3f1a-48b7-b29b-1c6b14b8887e\" (UID: \"710cab95-3f1a-48b7-b29b-1c6b14b8887e\") " Apr 16 14:15:27.628482 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.628459 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/710cab95-3f1a-48b7-b29b-1c6b14b8887e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "710cab95-3f1a-48b7-b29b-1c6b14b8887e" (UID: "710cab95-3f1a-48b7-b29b-1c6b14b8887e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:27.721856 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.721754 2575 generic.go:358] "Generic (PLEG): container finished" podID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerID="6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479" exitCode=137 Apr 16 14:15:27.721856 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.721808 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" event={"ID":"710cab95-3f1a-48b7-b29b-1c6b14b8887e","Type":"ContainerDied","Data":"6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479"} Apr 16 14:15:27.721856 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.721856 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" event={"ID":"710cab95-3f1a-48b7-b29b-1c6b14b8887e","Type":"ContainerDied","Data":"0ed937f0c3cadb403d1bc569934124203c44d7409d495627b096916c6c42494a"} Apr 16 14:15:27.722077 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.721876 2575 scope.go:117] "RemoveContainer" containerID="6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479" Apr 16 14:15:27.722077 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.721884 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg" Apr 16 14:15:27.729252 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.729232 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/710cab95-3f1a-48b7-b29b-1c6b14b8887e-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:15:27.729962 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.729942 2575 scope.go:117] "RemoveContainer" containerID="06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f" Apr 16 14:15:27.736930 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.736911 2575 scope.go:117] "RemoveContainer" containerID="0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040" Apr 16 14:15:27.743422 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.743397 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg"] Apr 16 14:15:27.744592 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.744578 2575 scope.go:117] "RemoveContainer" containerID="6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479" Apr 16 14:15:27.744863 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:15:27.744843 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479\": container with ID starting with 6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479 not found: ID does not exist" containerID="6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479" Apr 16 14:15:27.744977 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.744875 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479"} err="failed to get container status \"6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479\": rpc error: code = NotFound desc = could not find container 
\"6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479\": container with ID starting with 6c611fc419d18c9cd2b247777598e772080a56cdf488bca3a5f7c1a422b96479 not found: ID does not exist" Apr 16 14:15:27.744977 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.744900 2575 scope.go:117] "RemoveContainer" containerID="06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f" Apr 16 14:15:27.745205 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:15:27.745183 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f\": container with ID starting with 06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f not found: ID does not exist" containerID="06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f" Apr 16 14:15:27.745295 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.745215 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f"} err="failed to get container status \"06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f\": rpc error: code = NotFound desc = could not find container \"06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f\": container with ID starting with 06623c3de46cabe12133c191353b472f4ba5a3154fdf17fc1abe6b236b7c474f not found: ID does not exist" Apr 16 14:15:27.745295 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.745237 2575 scope.go:117] "RemoveContainer" containerID="0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040" Apr 16 14:15:27.745545 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:15:27.745521 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040\": container with ID starting with 0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040 not found: ID does not exist" containerID="0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040" Apr 16 14:15:27.745642 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.745550 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040"} err="failed to get container status \"0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040\": rpc error: code = NotFound desc = could not find container \"0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040\": container with ID starting with 0cbed02265e92915a0f3ae996ef9358b3b6a18a3c350197847e0e49e65069040 not found: ID does not exist" Apr 16 14:15:27.751584 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:27.748062 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-7cc0c-predictor-7b494555d9-95ltg"] Apr 16 14:15:28.389711 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:28.389677 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" path="/var/lib/kubelet/pods/710cab95-3f1a-48b7-b29b-1c6b14b8887e/volumes" Apr 16 14:15:33.651409 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:33.651363 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:15:43.651656 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:43.651607 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:15:53.651392 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:15:53.651344 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:16:03.651209 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:16:03.651158 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:16:11.387000 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:16:11.386957 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:16:21.387578 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:16:21.387534 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:16:31.387529 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:16:31.387484 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:16:41.387517 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:16:41.387465 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:16:51.387408 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:16:51.387356 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:17:01.387215 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:01.387161 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: 
connect: connection refused" Apr 16 14:17:06.344810 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:06.344776 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:17:06.348540 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:06.348513 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:17:11.387814 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:11.387770 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:17:21.388238 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:21.388204 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:17:27.083166 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.083120 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv"] Apr 16 14:17:27.083670 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.083499 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" containerID="cri-o://1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d" gracePeriod=30 Apr 16 14:17:27.175202 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175165 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z"] Apr 16 14:17:27.175567 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175553 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="storage-initializer" Apr 16 14:17:27.175614 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175569 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="storage-initializer" Apr 16 14:17:27.175614 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175579 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" Apr 16 14:17:27.175614 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175585 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" Apr 16 14:17:27.175614 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175596 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" Apr 16 14:17:27.175614 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175602 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" Apr 16 14:17:27.175775 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175654 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="agent" Apr 16 14:17:27.175775 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.175665 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="710cab95-3f1a-48b7-b29b-1c6b14b8887e" containerName="kserve-container" Apr 16 14:17:27.178689 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.178672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:17:27.187449 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.187421 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z"] Apr 16 14:17:27.301014 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.300976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04cd236f-53b4-4da3-90e7-c029b17aa260-kserve-provision-location\") pod \"isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z\" (UID: \"04cd236f-53b4-4da3-90e7-c029b17aa260\") " pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:17:27.402220 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.402177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04cd236f-53b4-4da3-90e7-c029b17aa260-kserve-provision-location\") pod \"isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z\" (UID: \"04cd236f-53b4-4da3-90e7-c029b17aa260\") " pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:17:27.402601 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.402576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04cd236f-53b4-4da3-90e7-c029b17aa260-kserve-provision-location\") pod \"isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z\" (UID: \"04cd236f-53b4-4da3-90e7-c029b17aa260\") " pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:17:27.489945 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.489911 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:17:27.608227 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:27.608200 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z"] Apr 16 14:17:27.610953 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:17:27.610926 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04cd236f_53b4_4da3_90e7_c029b17aa260.slice/crio-948a5a1ee80b8709f7c146e61976997ff316d8156570299020f277be0e3386b2 WatchSource:0}: Error finding container 948a5a1ee80b8709f7c146e61976997ff316d8156570299020f277be0e3386b2: Status 404 returned error can't find the container with id 948a5a1ee80b8709f7c146e61976997ff316d8156570299020f277be0e3386b2 Apr 16 14:17:28.080110 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:28.080018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" event={"ID":"04cd236f-53b4-4da3-90e7-c029b17aa260","Type":"ContainerStarted","Data":"84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63"} Apr 16 14:17:28.080110 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:28.080064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" event={"ID":"04cd236f-53b4-4da3-90e7-c029b17aa260","Type":"ContainerStarted","Data":"948a5a1ee80b8709f7c146e61976997ff316d8156570299020f277be0e3386b2"} Apr 16 14:17:31.387295 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:31.387252 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 14:17:32.092879 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:32.092844 2575 generic.go:358] "Generic (PLEG): container finished" podID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerID="84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63" exitCode=0 Apr 16 14:17:32.092879 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:32.092886 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" event={"ID":"04cd236f-53b4-4da3-90e7-c029b17aa260","Type":"ContainerDied","Data":"84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63"} Apr 16 14:17:33.097088 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:33.097051 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" event={"ID":"04cd236f-53b4-4da3-90e7-c029b17aa260","Type":"ContainerStarted","Data":"c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f"} Apr 16 14:17:33.097525 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:33.097428 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:17:33.098934 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:33.098904 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: 
connection refused" Apr 16 14:17:33.114764 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:33.114685 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podStartSLOduration=6.114665979 podStartE2EDuration="6.114665979s" podCreationTimestamp="2026-04-16 14:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:17:33.113914041 +0000 UTC m=+1227.292162910" watchObservedRunningTime="2026-04-16 14:17:33.114665979 +0000 UTC m=+1227.292914847" Apr 16 14:17:34.100419 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:34.100374 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 14:17:36.522640 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:36.522615 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:17:36.583092 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:36.583056 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc9c1d1d-5ded-4e1a-b096-56ce46b255e1-kserve-provision-location\") pod \"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1\" (UID: \"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1\") " Apr 16 14:17:36.583404 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:36.583378 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9c1d1d-5ded-4e1a-b096-56ce46b255e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" (UID: "fc9c1d1d-5ded-4e1a-b096-56ce46b255e1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:36.683574 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:36.683485 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc9c1d1d-5ded-4e1a-b096-56ce46b255e1-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:17:37.112044 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.112012 2575 generic.go:358] "Generic (PLEG): container finished" podID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerID="1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d" exitCode=0 Apr 16 14:17:37.112219 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.112085 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" Apr 16 14:17:37.112219 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.112104 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" event={"ID":"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1","Type":"ContainerDied","Data":"1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d"} Apr 16 14:17:37.112219 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.112144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv" event={"ID":"fc9c1d1d-5ded-4e1a-b096-56ce46b255e1","Type":"ContainerDied","Data":"1e48887b706dec5a8bfc74b6c630e720959d5f49fd40b7069f9fe6379ce676c8"} Apr 16 14:17:37.112219 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.112159 2575 scope.go:117] "RemoveContainer" containerID="1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d" Apr 16 14:17:37.120256 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.120231 2575 scope.go:117] "RemoveContainer" containerID="82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61" Apr 16 14:17:37.127916 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.127896 2575 scope.go:117] "RemoveContainer" containerID="1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d" Apr 16 14:17:37.128177 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:17:37.128152 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d\": container with ID starting with 1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d not found: ID does not exist" containerID="1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d" Apr 16 14:17:37.128245 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.128186 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d"} err="failed to get container status \"1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d\": rpc error: code = NotFound desc = could not find container \"1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d\": container with ID starting with 1f16a3b398ba203534bca2ab2184fcc45d44046e7cbd21a529d1adee9ea9341d not found: ID does not exist" Apr 16 14:17:37.128245 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.128204 2575 scope.go:117] "RemoveContainer" containerID="82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61" Apr 16 14:17:37.128440 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:17:37.128422 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61\": container with ID starting with 82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61 not found: ID does not exist" containerID="82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61" Apr 16 14:17:37.128488 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.128445 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61"} err="failed to get container status 
\"82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61\": rpc error: code = NotFound desc = could not find container \"82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61\": container with ID starting with 82ecb4ebb717d4dd3a38b4e9e9b2bcfedd1dfdc8a9ae085019d64942be2dda61 not found: ID does not exist" Apr 16 14:17:37.131839 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.131817 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv"] Apr 16 14:17:37.134160 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:37.134140 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9b575-predictor-6fd5844c66-whqrv"] Apr 16 14:17:38.390550 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:38.390510 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" path="/var/lib/kubelet/pods/fc9c1d1d-5ded-4e1a-b096-56ce46b255e1/volumes" Apr 16 14:17:44.101421 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:44.101372 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 14:17:54.100705 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:17:54.100659 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 14:18:04.100820 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:04.100768 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 14:18:06.380881 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:06.380846 2575 scope.go:117] "RemoveContainer" containerID="e605ee59964c227c21f2e05033c2c46defa1a6bc5a55162c5443f90f44093ec5" Apr 16 14:18:06.388309 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:06.388281 2575 scope.go:117] "RemoveContainer" containerID="eaa2bf15188c0012fa8e410b1da73731f73c06f808bbaf75d7c95077413db677" Apr 16 14:18:14.101138 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:14.101086 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 14:18:24.100556 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:24.100507 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 14:18:34.100417 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:34.100365 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" 
podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 14:18:44.102600 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:44.102567 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:18:47.354305 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.354268 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw"] Apr 16 14:18:47.354680 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.354575 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" Apr 16 14:18:47.354680 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.354586 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" Apr 16 14:18:47.354680 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.354602 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="storage-initializer" Apr 16 14:18:47.354680 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.354608 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="storage-initializer" Apr 16 14:18:47.354680 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.354657 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc9c1d1d-5ded-4e1a-b096-56ce46b255e1" containerName="kserve-container" Apr 16 14:18:47.357707 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.357674 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:18:47.360703 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.360669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-0dd324\"" Apr 16 14:18:47.361068 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.361044 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 14:18:47.361541 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.361516 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-0dd324-dockercfg-56zw6\"" Apr 16 14:18:47.369988 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.369959 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw"] Apr 16 14:18:47.464558 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.464517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-kserve-provision-location\") pod \"isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw\" (UID: \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\") " pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:18:47.464558 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.464564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-cabundle-cert\") pod \"isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw\" (UID: \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\") " pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:18:47.565944 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.565877 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-kserve-provision-location\") pod \"isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw\" (UID: \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\") " pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:18:47.565944 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.565954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-cabundle-cert\") pod \"isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw\" (UID: \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\") " pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:18:47.566289 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.566270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-kserve-provision-location\") pod \"isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw\" (UID: \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\") " pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:18:47.566533 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.566512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-cabundle-cert\") pod \"isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw\" (UID: \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\") " pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:18:47.672989 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.672891 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:18:47.794975 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.794947 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw"] Apr 16 14:18:47.797485 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:18:47.797451 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e49926c_52a9_466a_8bb0_4d8e3872cf1c.slice/crio-5c112826776da5504d3b55d349603af3d7e197ae8eabce34086396ab915116b3 WatchSource:0}: Error finding container 5c112826776da5504d3b55d349603af3d7e197ae8eabce34086396ab915116b3: Status 404 returned error can't find the container with id 5c112826776da5504d3b55d349603af3d7e197ae8eabce34086396ab915116b3 Apr 16 14:18:47.799304 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:47.799287 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:18:48.333691 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:48.333656 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" event={"ID":"4e49926c-52a9-466a-8bb0-4d8e3872cf1c","Type":"ContainerStarted","Data":"c38e36ed3a9d2b35d7513ab184a0c76e3bd105554705ce903a6125bf1d1ae3ea"} Apr 16 14:18:48.333691 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:48.333692 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" event={"ID":"4e49926c-52a9-466a-8bb0-4d8e3872cf1c","Type":"ContainerStarted","Data":"5c112826776da5504d3b55d349603af3d7e197ae8eabce34086396ab915116b3"} Apr 16 14:18:52.346231 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:52.346201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_4e49926c-52a9-466a-8bb0-4d8e3872cf1c/storage-initializer/0.log" Apr 16 14:18:52.346663 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:52.346238 2575 generic.go:358] "Generic (PLEG): container finished" podID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" containerID="c38e36ed3a9d2b35d7513ab184a0c76e3bd105554705ce903a6125bf1d1ae3ea" exitCode=1 Apr 16 14:18:52.346663 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:52.346312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" event={"ID":"4e49926c-52a9-466a-8bb0-4d8e3872cf1c","Type":"ContainerDied","Data":"c38e36ed3a9d2b35d7513ab184a0c76e3bd105554705ce903a6125bf1d1ae3ea"} Apr 16 14:18:53.350781 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:53.350754 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_4e49926c-52a9-466a-8bb0-4d8e3872cf1c/storage-initializer/0.log" Apr 16 14:18:53.351157 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:53.350870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" event={"ID":"4e49926c-52a9-466a-8bb0-4d8e3872cf1c","Type":"ContainerStarted","Data":"ca8375e69b44291283d668eab07859dfb31d30f821bfe88a25dfc358e6d03a75"} Apr 16 14:18:55.358042 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:55.358011 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_4e49926c-52a9-466a-8bb0-4d8e3872cf1c/storage-initializer/1.log" Apr 16 14:18:55.358470 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:55.358391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_4e49926c-52a9-466a-8bb0-4d8e3872cf1c/storage-initializer/0.log" Apr 16 14:18:55.358470 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:55.358429 2575 generic.go:358] "Generic (PLEG): container finished" podID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" containerID="ca8375e69b44291283d668eab07859dfb31d30f821bfe88a25dfc358e6d03a75" exitCode=1 Apr 16 14:18:55.358571 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:55.358504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" event={"ID":"4e49926c-52a9-466a-8bb0-4d8e3872cf1c","Type":"ContainerDied","Data":"ca8375e69b44291283d668eab07859dfb31d30f821bfe88a25dfc358e6d03a75"} Apr 16 14:18:55.358571 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:55.358546 2575 scope.go:117] "RemoveContainer" containerID="c38e36ed3a9d2b35d7513ab184a0c76e3bd105554705ce903a6125bf1d1ae3ea" Apr 16 14:18:55.358900 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:55.358885 2575 scope.go:117] "RemoveContainer" containerID="c38e36ed3a9d2b35d7513ab184a0c76e3bd105554705ce903a6125bf1d1ae3ea" Apr 16 14:18:55.368852 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:18:55.368819 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_kserve-ci-e2e-test_4e49926c-52a9-466a-8bb0-4d8e3872cf1c_0 in pod sandbox 5c112826776da5504d3b55d349603af3d7e197ae8eabce34086396ab915116b3 from index: no such id: 'c38e36ed3a9d2b35d7513ab184a0c76e3bd105554705ce903a6125bf1d1ae3ea'" containerID="c38e36ed3a9d2b35d7513ab184a0c76e3bd105554705ce903a6125bf1d1ae3ea" Apr 16 14:18:55.368941 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:18:55.368878 2575 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_kserve-ci-e2e-test_4e49926c-52a9-466a-8bb0-4d8e3872cf1c_0 in pod sandbox 5c112826776da5504d3b55d349603af3d7e197ae8eabce34086396ab915116b3 from index: no such id: 'c38e36ed3a9d2b35d7513ab184a0c76e3bd105554705ce903a6125bf1d1ae3ea'; Skipping pod \"isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_kserve-ci-e2e-test(4e49926c-52a9-466a-8bb0-4d8e3872cf1c)\"" logger="UnhandledError" Apr 16 14:18:55.370227 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:18:55.370200 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_kserve-ci-e2e-test(4e49926c-52a9-466a-8bb0-4d8e3872cf1c)\"" 
pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" podUID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" Apr 16 14:18:56.362557 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:18:56.362527 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_4e49926c-52a9-466a-8bb0-4d8e3872cf1c/storage-initializer/1.log" Apr 16 14:19:03.405866 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.405831 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw"] Apr 16 14:19:03.459382 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.459347 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z"] Apr 16 14:19:03.459650 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.459623 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" containerID="cri-o://c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f" gracePeriod=30 Apr 16 14:19:03.516654 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.516619 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn"] Apr 16 14:19:03.518885 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.518863 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:03.521386 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.521363 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-9f5836-dockercfg-2qfk4\"" Apr 16 14:19:03.521508 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.521367 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-9f5836\"" Apr 16 14:19:03.530700 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.530672 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn"] Apr 16 14:19:03.549168 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.549142 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_4e49926c-52a9-466a-8bb0-4d8e3872cf1c/storage-initializer/1.log" Apr 16 14:19:03.549309 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.549216 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:19:03.598665 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.598626 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea895d81-0422-4054-9fb6-b364dfc42f26-cabundle-cert\") pod \"isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn\" (UID: \"ea895d81-0422-4054-9fb6-b364dfc42f26\") " pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:03.598860 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.598693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea895d81-0422-4054-9fb6-b364dfc42f26-kserve-provision-location\") pod \"isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn\" (UID: \"ea895d81-0422-4054-9fb6-b364dfc42f26\") " pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:03.699963 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.699874 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-cabundle-cert\") pod \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\" (UID: \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\") " Apr 16 14:19:03.699963 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.699929 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-kserve-provision-location\") pod \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\" (UID: \"4e49926c-52a9-466a-8bb0-4d8e3872cf1c\") " Apr 16 14:19:03.700141 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.700040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea895d81-0422-4054-9fb6-b364dfc42f26-kserve-provision-location\") pod \"isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn\" (UID: \"ea895d81-0422-4054-9fb6-b364dfc42f26\") " pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:03.700141 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.700109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea895d81-0422-4054-9fb6-b364dfc42f26-cabundle-cert\") pod \"isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn\" (UID: \"ea895d81-0422-4054-9fb6-b364dfc42f26\") " pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:03.700224 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.700199 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4e49926c-52a9-466a-8bb0-4d8e3872cf1c" (UID: "4e49926c-52a9-466a-8bb0-4d8e3872cf1c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:03.700298 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.700274 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4e49926c-52a9-466a-8bb0-4d8e3872cf1c" (UID: "4e49926c-52a9-466a-8bb0-4d8e3872cf1c"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:19:03.700419 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.700401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea895d81-0422-4054-9fb6-b364dfc42f26-kserve-provision-location\") pod \"isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn\" (UID: \"ea895d81-0422-4054-9fb6-b364dfc42f26\") " pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:03.700661 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.700645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea895d81-0422-4054-9fb6-b364dfc42f26-cabundle-cert\") pod \"isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn\" (UID: \"ea895d81-0422-4054-9fb6-b364dfc42f26\") " pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:03.801353 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.801318 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-cabundle-cert\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:19:03.801353 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.801354 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e49926c-52a9-466a-8bb0-4d8e3872cf1c-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:19:03.832750 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.832692 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:03.956106 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:03.956025 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn"] Apr 16 14:19:03.959172 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:19:03.959143 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea895d81_0422_4054_9fb6_b364dfc42f26.slice/crio-78291b2aa36ce26d071c7b4a8f494629ccaa43c57b939ee20857743fbdedcd54 WatchSource:0}: Error finding container 78291b2aa36ce26d071c7b4a8f494629ccaa43c57b939ee20857743fbdedcd54: Status 404 returned error can't find the container with id 78291b2aa36ce26d071c7b4a8f494629ccaa43c57b939ee20857743fbdedcd54 Apr 16 14:19:04.100967 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.100920 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 14:19:04.394135 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.390388 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw_4e49926c-52a9-466a-8bb0-4d8e3872cf1c/storage-initializer/1.log" Apr 16 14:19:04.394135 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.390621 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" Apr 16 14:19:04.398900 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.398862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" event={"ID":"ea895d81-0422-4054-9fb6-b364dfc42f26","Type":"ContainerStarted","Data":"a8bd12cae068b630a3688b76cf91ab52c06d6ae92df0ba1feb9a5fea282901bf"} Apr 16 14:19:04.398900 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.398906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" event={"ID":"ea895d81-0422-4054-9fb6-b364dfc42f26","Type":"ContainerStarted","Data":"78291b2aa36ce26d071c7b4a8f494629ccaa43c57b939ee20857743fbdedcd54"} Apr 16 14:19:04.399103 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.398921 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw" event={"ID":"4e49926c-52a9-466a-8bb0-4d8e3872cf1c","Type":"ContainerDied","Data":"5c112826776da5504d3b55d349603af3d7e197ae8eabce34086396ab915116b3"} Apr 16 14:19:04.399103 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.398946 2575 scope.go:117] "RemoveContainer" containerID="ca8375e69b44291283d668eab07859dfb31d30f821bfe88a25dfc358e6d03a75" Apr 16 14:19:04.435418 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.435381 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw"] Apr 16 14:19:04.439314 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:04.439283 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0dd324-predictor-66c78d4b56-8zdcw"] Apr 16 14:19:06.391271 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:06.391239 
2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" path="/var/lib/kubelet/pods/4e49926c-52a9-466a-8bb0-4d8e3872cf1c/volumes" Apr 16 14:19:07.902515 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:07.902453 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:19:08.038365 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.038262 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04cd236f-53b4-4da3-90e7-c029b17aa260-kserve-provision-location\") pod \"04cd236f-53b4-4da3-90e7-c029b17aa260\" (UID: \"04cd236f-53b4-4da3-90e7-c029b17aa260\") " Apr 16 14:19:08.038604 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.038579 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04cd236f-53b4-4da3-90e7-c029b17aa260-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "04cd236f-53b4-4da3-90e7-c029b17aa260" (UID: "04cd236f-53b4-4da3-90e7-c029b17aa260"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:08.138989 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.138952 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04cd236f-53b4-4da3-90e7-c029b17aa260-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:19:08.404664 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.404631 2575 generic.go:358] "Generic (PLEG): container finished" podID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerID="c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f" exitCode=0 Apr 16 14:19:08.404836 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.404702 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" Apr 16 14:19:08.404836 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.404716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" event={"ID":"04cd236f-53b4-4da3-90e7-c029b17aa260","Type":"ContainerDied","Data":"c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f"} Apr 16 14:19:08.404836 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.404776 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z" event={"ID":"04cd236f-53b4-4da3-90e7-c029b17aa260","Type":"ContainerDied","Data":"948a5a1ee80b8709f7c146e61976997ff316d8156570299020f277be0e3386b2"} Apr 16 14:19:08.404836 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.404797 2575 scope.go:117] "RemoveContainer" containerID="c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f" Apr 16 14:19:08.406283 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.406262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_ea895d81-0422-4054-9fb6-b364dfc42f26/storage-initializer/0.log" Apr 16 14:19:08.406385 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.406303 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea895d81-0422-4054-9fb6-b364dfc42f26" containerID="a8bd12cae068b630a3688b76cf91ab52c06d6ae92df0ba1feb9a5fea282901bf" exitCode=1 Apr 16 14:19:08.406385 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.406372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" event={"ID":"ea895d81-0422-4054-9fb6-b364dfc42f26","Type":"ContainerDied","Data":"a8bd12cae068b630a3688b76cf91ab52c06d6ae92df0ba1feb9a5fea282901bf"} Apr 16 14:19:08.413042 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.412931 2575 scope.go:117] "RemoveContainer" containerID="84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63" Apr 16 14:19:08.420178 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.420159 2575 scope.go:117] "RemoveContainer" containerID="c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f" Apr 16 14:19:08.421097 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:19:08.421059 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f\": container with ID starting with c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f not found: ID does not exist" containerID="c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f" Apr 16 14:19:08.421209 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.421105 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f"} err="failed to get container status \"c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f\": rpc error: code = NotFound desc = could not find container \"c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f\": container with ID starting with c34a274ca436fe10b4b4cbcd2c1755fc8641c64c155b200ac2288434e139392f not found: ID does not exist" Apr 16 14:19:08.421209 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.421127 2575 scope.go:117] "RemoveContainer" 
containerID="84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63" Apr 16 14:19:08.421503 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:19:08.421475 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63\": container with ID starting with 84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63 not found: ID does not exist" containerID="84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63" Apr 16 14:19:08.421589 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.421514 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63"} err="failed to get container status \"84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63\": rpc error: code = NotFound desc = could not find container \"84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63\": container with ID starting with 84ffd9a35e0b1332a24ac1b4a6f8edda37117b566bd7fc8051eda1f7d53a0d63 not found: ID does not exist" Apr 16 14:19:08.422821 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.422796 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z"] Apr 16 14:19:08.427785 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:08.427748 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0dd324-predictor-5f56f6dd6f-wvg7z"] Apr 16 14:19:09.411912 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:09.411880 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_ea895d81-0422-4054-9fb6-b364dfc42f26/storage-initializer/0.log" Apr 16 14:19:09.412368 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:09.411929 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" event={"ID":"ea895d81-0422-4054-9fb6-b364dfc42f26","Type":"ContainerStarted","Data":"d2fa60ac300bdbba866ae51dc2295a3b19f34598a1bcf771f217def1574d8b06"} Apr 16 14:19:10.390439 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:10.390404 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" path="/var/lib/kubelet/pods/04cd236f-53b4-4da3-90e7-c029b17aa260/volumes" Apr 16 14:19:13.425513 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.425487 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_ea895d81-0422-4054-9fb6-b364dfc42f26/storage-initializer/1.log" Apr 16 14:19:13.425951 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.425853 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_ea895d81-0422-4054-9fb6-b364dfc42f26/storage-initializer/0.log" Apr 16 14:19:13.425951 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.425881 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea895d81-0422-4054-9fb6-b364dfc42f26" containerID="d2fa60ac300bdbba866ae51dc2295a3b19f34598a1bcf771f217def1574d8b06" exitCode=1 Apr 16 14:19:13.425951 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.425935 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" 
event={"ID":"ea895d81-0422-4054-9fb6-b364dfc42f26","Type":"ContainerDied","Data":"d2fa60ac300bdbba866ae51dc2295a3b19f34598a1bcf771f217def1574d8b06"} Apr 16 14:19:13.426073 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.425967 2575 scope.go:117] "RemoveContainer" containerID="a8bd12cae068b630a3688b76cf91ab52c06d6ae92df0ba1feb9a5fea282901bf" Apr 16 14:19:13.426326 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.426306 2575 scope.go:117] "RemoveContainer" containerID="a8bd12cae068b630a3688b76cf91ab52c06d6ae92df0ba1feb9a5fea282901bf" Apr 16 14:19:13.435454 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:19:13.435415 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_kserve-ci-e2e-test_ea895d81-0422-4054-9fb6-b364dfc42f26_0 in pod sandbox 78291b2aa36ce26d071c7b4a8f494629ccaa43c57b939ee20857743fbdedcd54 from index: no such id: 'a8bd12cae068b630a3688b76cf91ab52c06d6ae92df0ba1feb9a5fea282901bf'" containerID="a8bd12cae068b630a3688b76cf91ab52c06d6ae92df0ba1feb9a5fea282901bf" Apr 16 14:19:13.435561 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:19:13.435464 2575 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_kserve-ci-e2e-test_ea895d81-0422-4054-9fb6-b364dfc42f26_0 in pod sandbox 78291b2aa36ce26d071c7b4a8f494629ccaa43c57b939ee20857743fbdedcd54 from index: no such id: 'a8bd12cae068b630a3688b76cf91ab52c06d6ae92df0ba1feb9a5fea282901bf'; Skipping pod \"isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_kserve-ci-e2e-test(ea895d81-0422-4054-9fb6-b364dfc42f26)\"" logger="UnhandledError" Apr 16 14:19:13.437035 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:19:13.437013 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_kserve-ci-e2e-test(ea895d81-0422-4054-9fb6-b364dfc42f26)\"" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" podUID="ea895d81-0422-4054-9fb6-b364dfc42f26" Apr 16 14:19:13.524139 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.524104 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn"] Apr 16 14:19:13.637541 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.637509 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr"] Apr 16 14:19:13.637868 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.637853 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" containerName="storage-initializer" Apr 16 14:19:13.637942 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.637871 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" containerName="storage-initializer" Apr 16 14:19:13.637942 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.637891 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" Apr 16 14:19:13.637942 ip-10-0-141-131 kubenswrapper[2575]: I0416 
14:19:13.637899 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" Apr 16 14:19:13.637942 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.637913 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" containerName="storage-initializer" Apr 16 14:19:13.637942 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.637919 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" containerName="storage-initializer" Apr 16 14:19:13.637942 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.637933 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="storage-initializer" Apr 16 14:19:13.637942 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.637941 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="storage-initializer" Apr 16 14:19:13.638286 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.638003 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" containerName="storage-initializer" Apr 16 14:19:13.638286 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.638014 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e49926c-52a9-466a-8bb0-4d8e3872cf1c" containerName="storage-initializer" Apr 16 14:19:13.638286 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.638026 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="04cd236f-53b4-4da3-90e7-c029b17aa260" containerName="kserve-container" Apr 16 14:19:13.640963 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.640928 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:19:13.644876 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.643404 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9pgb7\"" Apr 16 14:19:13.650456 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.650430 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr"] Apr 16 14:19:13.791465 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.791377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f-kserve-provision-location\") pod \"raw-sklearn-eca71-predictor-df48fb76-48pcr\" (UID: \"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f\") " pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:19:13.892860 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.892802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f-kserve-provision-location\") pod \"raw-sklearn-eca71-predictor-df48fb76-48pcr\" (UID: \"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f\") " pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:19:13.893218 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.893196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f-kserve-provision-location\") pod \"raw-sklearn-eca71-predictor-df48fb76-48pcr\" (UID: \"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f\") " pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:19:13.955704 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:13.955674 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:19:14.076539 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.076511 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr"] Apr 16 14:19:14.079132 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:19:14.079103 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a0cbfcd_168e_433a_9b1b_c7d1338f9d9f.slice/crio-14b5f7d692668da4b631e0f989fce51ba3618325c140338617ad86dc67f4ecb0 WatchSource:0}: Error finding container 14b5f7d692668da4b631e0f989fce51ba3618325c140338617ad86dc67f4ecb0: Status 404 returned error can't find the container with id 14b5f7d692668da4b631e0f989fce51ba3618325c140338617ad86dc67f4ecb0 Apr 16 14:19:14.430259 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.430231 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_ea895d81-0422-4054-9fb6-b364dfc42f26/storage-initializer/1.log" Apr 16 14:19:14.432069 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.432026 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" event={"ID":"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f","Type":"ContainerStarted","Data":"c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b"} Apr 16 14:19:14.432197 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.432070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" event={"ID":"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f","Type":"ContainerStarted","Data":"14b5f7d692668da4b631e0f989fce51ba3618325c140338617ad86dc67f4ecb0"} Apr 16 14:19:14.559605 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.559582 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_ea895d81-0422-4054-9fb6-b364dfc42f26/storage-initializer/1.log" Apr 16 14:19:14.559716 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.559645 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:14.700418 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.700330 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea895d81-0422-4054-9fb6-b364dfc42f26-kserve-provision-location\") pod \"ea895d81-0422-4054-9fb6-b364dfc42f26\" (UID: \"ea895d81-0422-4054-9fb6-b364dfc42f26\") " Apr 16 14:19:14.700418 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.700387 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea895d81-0422-4054-9fb6-b364dfc42f26-cabundle-cert\") pod \"ea895d81-0422-4054-9fb6-b364dfc42f26\" (UID: \"ea895d81-0422-4054-9fb6-b364dfc42f26\") " Apr 16 14:19:14.700625 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.700594 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea895d81-0422-4054-9fb6-b364dfc42f26-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ea895d81-0422-4054-9fb6-b364dfc42f26" (UID: "ea895d81-0422-4054-9fb6-b364dfc42f26"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:14.700767 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.700749 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea895d81-0422-4054-9fb6-b364dfc42f26-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ea895d81-0422-4054-9fb6-b364dfc42f26" (UID: "ea895d81-0422-4054-9fb6-b364dfc42f26"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:19:14.801122 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.801085 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea895d81-0422-4054-9fb6-b364dfc42f26-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:19:14.801122 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:14.801119 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ea895d81-0422-4054-9fb6-b364dfc42f26-cabundle-cert\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:19:15.435879 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:15.435852 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn_ea895d81-0422-4054-9fb6-b364dfc42f26/storage-initializer/1.log" Apr 16 14:19:15.436293 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:15.435973 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" Apr 16 14:19:15.436293 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:15.435983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn" event={"ID":"ea895d81-0422-4054-9fb6-b364dfc42f26","Type":"ContainerDied","Data":"78291b2aa36ce26d071c7b4a8f494629ccaa43c57b939ee20857743fbdedcd54"} Apr 16 14:19:15.436293 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:15.436027 2575 scope.go:117] "RemoveContainer" containerID="d2fa60ac300bdbba866ae51dc2295a3b19f34598a1bcf771f217def1574d8b06" Apr 16 14:19:15.468621 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:15.468587 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn"] Apr 16 14:19:15.472392 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:15.472366 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-9f5836-predictor-585f8bcf78-z24bn"] Apr 16 14:19:16.390353 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:16.390316 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea895d81-0422-4054-9fb6-b364dfc42f26" path="/var/lib/kubelet/pods/ea895d81-0422-4054-9fb6-b364dfc42f26/volumes" Apr 16 14:19:18.447154 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:18.447123 2575 generic.go:358] "Generic (PLEG): container finished" podID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerID="c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b" exitCode=0 Apr 16 14:19:18.447551 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:18.447210 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" 
event={"ID":"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f","Type":"ContainerDied","Data":"c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b"} Apr 16 14:19:19.451735 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:19.451699 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" event={"ID":"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f","Type":"ContainerStarted","Data":"ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7"} Apr 16 14:19:19.452140 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:19.452097 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:19:19.453334 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:19.453308 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:19:19.467941 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:19.467879 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podStartSLOduration=6.467860278 podStartE2EDuration="6.467860278s" podCreationTimestamp="2026-04-16 14:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:19:19.467756326 +0000 UTC m=+1333.646005192" watchObservedRunningTime="2026-04-16 14:19:19.467860278 +0000 UTC m=+1333.646109147" Apr 16 14:19:20.455258 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:20.455214 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:19:30.455349 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:30.455295 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:19:40.455987 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:40.455940 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:19:50.455752 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:19:50.455680 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:20:00.456037 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:00.455980 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: 
connect: connection refused" Apr 16 14:20:10.455602 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:10.455544 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:20:20.455391 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:20.455332 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:20:22.387153 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:22.387104 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 14:20:32.390703 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:32.390672 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:20:33.768009 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.767975 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr"] Apr 16 14:20:33.768466 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.768254 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" containerID="cri-o://ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7" gracePeriod=30 Apr 16 14:20:33.839512 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.839473 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889"] Apr 16 14:20:33.839850 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.839835 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea895d81-0422-4054-9fb6-b364dfc42f26" containerName="storage-initializer" Apr 16 14:20:33.839916 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.839851 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea895d81-0422-4054-9fb6-b364dfc42f26" containerName="storage-initializer" Apr 16 14:20:33.839916 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.839873 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea895d81-0422-4054-9fb6-b364dfc42f26" containerName="storage-initializer" Apr 16 14:20:33.839916 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.839878 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea895d81-0422-4054-9fb6-b364dfc42f26" containerName="storage-initializer" Apr 16 14:20:33.840063 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.839925 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea895d81-0422-4054-9fb6-b364dfc42f26" containerName="storage-initializer" Apr 16 14:20:33.840063 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.839936 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea895d81-0422-4054-9fb6-b364dfc42f26" 
containerName="storage-initializer" Apr 16 14:20:33.842920 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.842900 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:20:33.853451 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.853420 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889"] Apr 16 14:20:33.871035 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.870993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d1fcbff-1731-4fef-a12c-01571e049a76-kserve-provision-location\") pod \"raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889\" (UID: \"5d1fcbff-1731-4fef-a12c-01571e049a76\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:20:33.971902 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.971850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d1fcbff-1731-4fef-a12c-01571e049a76-kserve-provision-location\") pod \"raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889\" (UID: \"5d1fcbff-1731-4fef-a12c-01571e049a76\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:20:33.972291 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:33.972271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d1fcbff-1731-4fef-a12c-01571e049a76-kserve-provision-location\") pod \"raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889\" (UID: \"5d1fcbff-1731-4fef-a12c-01571e049a76\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:20:34.154699 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:34.154645 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:20:34.281867 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:34.281830 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889"] Apr 16 14:20:34.284572 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:20:34.284539 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1fcbff_1731_4fef_a12c_01571e049a76.slice/crio-e95e365c1c5bc078739fcc67877141b4d44f507a7e2f72d7e1fe3dc0731c4434 WatchSource:0}: Error finding container e95e365c1c5bc078739fcc67877141b4d44f507a7e2f72d7e1fe3dc0731c4434: Status 404 returned error can't find the container with id e95e365c1c5bc078739fcc67877141b4d44f507a7e2f72d7e1fe3dc0731c4434 Apr 16 14:20:34.696904 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:34.696861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" event={"ID":"5d1fcbff-1731-4fef-a12c-01571e049a76","Type":"ContainerStarted","Data":"ca7887ae697a436950ab36b9dbc6734a9fd238225acbf21d72ab4ddcad4432c0"} Apr 16 14:20:34.697102 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:34.696913 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" event={"ID":"5d1fcbff-1731-4fef-a12c-01571e049a76","Type":"ContainerStarted","Data":"e95e365c1c5bc078739fcc67877141b4d44f507a7e2f72d7e1fe3dc0731c4434"} Apr 16 14:20:38.406228 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.406201 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:20:38.508529 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.508492 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f-kserve-provision-location\") pod \"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f\" (UID: \"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f\") " Apr 16 14:20:38.508873 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.508846 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" (UID: "3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:38.609519 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.609470 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:20:38.710381 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.710296 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerID="ca7887ae697a436950ab36b9dbc6734a9fd238225acbf21d72ab4ddcad4432c0" exitCode=0 Apr 16 14:20:38.710381 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.710368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" event={"ID":"5d1fcbff-1731-4fef-a12c-01571e049a76","Type":"ContainerDied","Data":"ca7887ae697a436950ab36b9dbc6734a9fd238225acbf21d72ab4ddcad4432c0"} Apr 16 14:20:38.711707 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.711687 2575 generic.go:358] "Generic (PLEG): container finished" podID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerID="ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7" exitCode=0 Apr 16 14:20:38.711829 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.711718 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" event={"ID":"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f","Type":"ContainerDied","Data":"ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7"} Apr 16 14:20:38.711829 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.711758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" event={"ID":"3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f","Type":"ContainerDied","Data":"14b5f7d692668da4b631e0f989fce51ba3618325c140338617ad86dc67f4ecb0"} Apr 16 14:20:38.711829 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.711758 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr" Apr 16 14:20:38.711829 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.711770 2575 scope.go:117] "RemoveContainer" containerID="ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7" Apr 16 14:20:38.720514 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.720494 2575 scope.go:117] "RemoveContainer" containerID="c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b" Apr 16 14:20:38.728390 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.728366 2575 scope.go:117] "RemoveContainer" containerID="ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7" Apr 16 14:20:38.728662 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:20:38.728643 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7\": container with ID starting with ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7 not found: ID does not exist" containerID="ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7" Apr 16 14:20:38.728718 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.728671 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7"} err="failed to get container status \"ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7\": rpc error: code = NotFound desc = could not find container \"ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7\": container with ID starting with ec3aa88a9c7244eb704d9c325e801c97022b3e514288ee9e6b82496dd17f29a7 not found: ID does not exist" Apr 16 14:20:38.728718 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.728691 2575 scope.go:117] "RemoveContainer" containerID="c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b" Apr 16 14:20:38.728981 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:20:38.728959 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b\": container with ID starting with c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b not found: ID does not exist" containerID="c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b" Apr 16 14:20:38.729045 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.728992 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b"} err="failed to get container status \"c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b\": rpc error: code = NotFound desc = could not find container \"c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b\": container with ID starting with c491ca09c9ace21d95f593e0c97010d91fb7dcf70204432aab10c96c1c25a91b not found: ID does not exist" Apr 16 14:20:38.738327 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.738293 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr"] Apr 16 14:20:38.741910 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:38.741882 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-eca71-predictor-df48fb76-48pcr"] Apr 16 14:20:39.717417 ip-10-0-141-131 kubenswrapper[2575]: I0416 
14:20:39.717381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" event={"ID":"5d1fcbff-1731-4fef-a12c-01571e049a76","Type":"ContainerStarted","Data":"af9173dc1d41252a26e785bdc7e0eb7bda29f7d0cf497328860af0128c50841e"} Apr 16 14:20:39.717887 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:39.717666 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:20:39.719228 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:39.719202 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:20:39.734772 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:39.734703 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podStartSLOduration=6.734688757 podStartE2EDuration="6.734688757s" podCreationTimestamp="2026-04-16 14:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:39.732576086 +0000 UTC m=+1413.910824953" watchObservedRunningTime="2026-04-16 14:20:39.734688757 +0000 UTC m=+1413.912937625" Apr 16 14:20:40.390439 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:40.390405 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" path="/var/lib/kubelet/pods/3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f/volumes" Apr 16 14:20:40.720952 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:40.720854 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:20:50.721240 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:20:50.721198 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:21:00.721936 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:00.721842 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:21:10.721174 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:10.721120 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:21:20.721710 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:20.721650 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" 
podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:21:30.721126 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:30.721082 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:21:40.721428 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:40.721382 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 14:21:50.722813 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:50.722777 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:21:53.964640 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:53.964602 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889"] Apr 16 14:21:53.965075 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:53.964882 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" containerID="cri-o://af9173dc1d41252a26e785bdc7e0eb7bda29f7d0cf497328860af0128c50841e" gracePeriod=30 Apr 16 14:21:55.201244 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.201202 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxdfc/must-gather-kfxjf"] Apr 16 14:21:55.201717 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.201645 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="storage-initializer" Apr 16 14:21:55.201717 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.201676 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="storage-initializer" Apr 16 14:21:55.201717 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.201700 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" Apr 16 14:21:55.201717 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.201708 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" Apr 16 14:21:55.201989 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.201808 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a0cbfcd-168e-433a-9b1b-c7d1338f9d9f" containerName="kserve-container" Apr 16 14:21:55.204916 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.204894 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:21:55.207637 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.207612 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxdfc\"/\"openshift-service-ca.crt\"" Apr 16 14:21:55.207743 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.207648 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kxdfc\"/\"default-dockercfg-4ll8z\"" Apr 16 14:21:55.207743 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.207651 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxdfc\"/\"kube-root-ca.crt\"" Apr 16 14:21:55.214802 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.214778 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxdfc/must-gather-kfxjf"] Apr 16 14:21:55.362824 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.362786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxc78\" (UniqueName: \"kubernetes.io/projected/ad49946b-adff-45ab-b8ca-ba83f7e48082-kube-api-access-jxc78\") pod \"must-gather-kfxjf\" (UID: \"ad49946b-adff-45ab-b8ca-ba83f7e48082\") " pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:21:55.362824 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.362829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad49946b-adff-45ab-b8ca-ba83f7e48082-must-gather-output\") pod \"must-gather-kfxjf\" (UID: \"ad49946b-adff-45ab-b8ca-ba83f7e48082\") " pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:21:55.463773 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.463644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxc78\" (UniqueName: \"kubernetes.io/projected/ad49946b-adff-45ab-b8ca-ba83f7e48082-kube-api-access-jxc78\") pod \"must-gather-kfxjf\" (UID: \"ad49946b-adff-45ab-b8ca-ba83f7e48082\") " pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:21:55.463940 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.463797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad49946b-adff-45ab-b8ca-ba83f7e48082-must-gather-output\") pod \"must-gather-kfxjf\" (UID: \"ad49946b-adff-45ab-b8ca-ba83f7e48082\") " pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:21:55.464172 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.464154 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad49946b-adff-45ab-b8ca-ba83f7e48082-must-gather-output\") pod \"must-gather-kfxjf\" (UID: \"ad49946b-adff-45ab-b8ca-ba83f7e48082\") " pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:21:55.472179 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.472146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxc78\" (UniqueName: \"kubernetes.io/projected/ad49946b-adff-45ab-b8ca-ba83f7e48082-kube-api-access-jxc78\") pod \"must-gather-kfxjf\" (UID: \"ad49946b-adff-45ab-b8ca-ba83f7e48082\") " pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:21:55.526881 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.526840 2575 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:21:55.652388 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.652289 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxdfc/must-gather-kfxjf"] Apr 16 14:21:55.655341 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:21:55.655313 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad49946b_adff_45ab_b8ca_ba83f7e48082.slice/crio-752b297d51f1de701f2de5a0ef0f7c9975af46e92a552874da2e53c2064be933 WatchSource:0}: Error finding container 752b297d51f1de701f2de5a0ef0f7c9975af46e92a552874da2e53c2064be933: Status 404 returned error can't find the container with id 752b297d51f1de701f2de5a0ef0f7c9975af46e92a552874da2e53c2064be933 Apr 16 14:21:55.946671 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:55.946636 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" event={"ID":"ad49946b-adff-45ab-b8ca-ba83f7e48082","Type":"ContainerStarted","Data":"752b297d51f1de701f2de5a0ef0f7c9975af46e92a552874da2e53c2064be933"} Apr 16 14:21:59.962196 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:59.962161 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerID="af9173dc1d41252a26e785bdc7e0eb7bda29f7d0cf497328860af0128c50841e" exitCode=0 Apr 16 14:21:59.962622 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:21:59.962245 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" event={"ID":"5d1fcbff-1731-4fef-a12c-01571e049a76","Type":"ContainerDied","Data":"af9173dc1d41252a26e785bdc7e0eb7bda29f7d0cf497328860af0128c50841e"} Apr 16 14:22:00.323973 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.323950 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:22:00.407036 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.406985 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d1fcbff-1731-4fef-a12c-01571e049a76-kserve-provision-location\") pod \"5d1fcbff-1731-4fef-a12c-01571e049a76\" (UID: \"5d1fcbff-1731-4fef-a12c-01571e049a76\") " Apr 16 14:22:00.407405 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.407372 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1fcbff-1731-4fef-a12c-01571e049a76-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5d1fcbff-1731-4fef-a12c-01571e049a76" (UID: "5d1fcbff-1731-4fef-a12c-01571e049a76"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:00.508474 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.508425 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d1fcbff-1731-4fef-a12c-01571e049a76-kserve-provision-location\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:22:00.967607 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.967565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" event={"ID":"ad49946b-adff-45ab-b8ca-ba83f7e48082","Type":"ContainerStarted","Data":"9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521"} Apr 16 14:22:00.968101 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.967617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" event={"ID":"ad49946b-adff-45ab-b8ca-ba83f7e48082","Type":"ContainerStarted","Data":"3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755"} Apr 16 14:22:00.969191 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.969159 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" event={"ID":"5d1fcbff-1731-4fef-a12c-01571e049a76","Type":"ContainerDied","Data":"e95e365c1c5bc078739fcc67877141b4d44f507a7e2f72d7e1fe3dc0731c4434"} Apr 16 14:22:00.969334 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.969200 2575 scope.go:117] "RemoveContainer" containerID="af9173dc1d41252a26e785bdc7e0eb7bda29f7d0cf497328860af0128c50841e" Apr 16 14:22:00.969334 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.969206 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889" Apr 16 14:22:00.979622 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.979603 2575 scope.go:117] "RemoveContainer" containerID="ca7887ae697a436950ab36b9dbc6734a9fd238225acbf21d72ab4ddcad4432c0" Apr 16 14:22:00.982400 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.982344 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" podStartSLOduration=1.0410709 podStartE2EDuration="5.982327379s" podCreationTimestamp="2026-04-16 14:21:55 +0000 UTC" firstStartedPulling="2026-04-16 14:21:55.657699878 +0000 UTC m=+1489.835948739" lastFinishedPulling="2026-04-16 14:22:00.598956358 +0000 UTC m=+1494.777205218" observedRunningTime="2026-04-16 14:22:00.981836658 +0000 UTC m=+1495.160085529" watchObservedRunningTime="2026-04-16 14:22:00.982327379 +0000 UTC m=+1495.160576251" Apr 16 14:22:00.993983 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:00.993953 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889"] Apr 16 14:22:01.001036 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:01.001006 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-16ff5-predictor-d968b69d4-q7889"] Apr 16 14:22:02.390691 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:02.390647 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" path="/var/lib/kubelet/pods/5d1fcbff-1731-4fef-a12c-01571e049a76/volumes" Apr 16 14:22:06.370040 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:06.370011 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:22:06.372328 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:06.372304 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:22:19.028603 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:19.028568 2575 generic.go:358] "Generic (PLEG): container finished" podID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerID="3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755" exitCode=0 Apr 16 14:22:19.029039 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:19.028635 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" event={"ID":"ad49946b-adff-45ab-b8ca-ba83f7e48082","Type":"ContainerDied","Data":"3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755"} Apr 16 14:22:19.029039 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:19.028972 2575 scope.go:117] "RemoveContainer" containerID="3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755" Apr 16 14:22:19.805196 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:19.805170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kxdfc_must-gather-kfxjf_ad49946b-adff-45ab-b8ca-ba83f7e48082/gather/0.log" Apr 16 14:22:20.335197 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.335163 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s6bn5/must-gather-hqkxc"] Apr 16 14:22:20.335598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.335487 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="storage-initializer" Apr 16 14:22:20.335598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.335498 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="storage-initializer" Apr 16 14:22:20.335598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.335515 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" Apr 16 14:22:20.335598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.335521 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" Apr 16 14:22:20.335598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.335567 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d1fcbff-1731-4fef-a12c-01571e049a76" containerName="kserve-container" Apr 16 14:22:20.338625 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.338604 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s6bn5/must-gather-hqkxc" Apr 16 14:22:20.340893 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.340869 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s6bn5\"/\"kube-root-ca.crt\"" Apr 16 14:22:20.341003 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.340926 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s6bn5\"/\"openshift-service-ca.crt\"" Apr 16 14:22:20.341862 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.341847 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-s6bn5\"/\"default-dockercfg-nnrnd\"" Apr 16 14:22:20.346115 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.346090 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s6bn5/must-gather-hqkxc"] Apr 16 14:22:20.378548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.378512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2fq\" (UniqueName: \"kubernetes.io/projected/01369e6b-6ad7-4df3-a695-d7d3dde2c4f5-kube-api-access-gk2fq\") pod \"must-gather-hqkxc\" (UID: \"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5\") " pod="openshift-must-gather-s6bn5/must-gather-hqkxc" Apr 16 14:22:20.378715 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.378574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/01369e6b-6ad7-4df3-a695-d7d3dde2c4f5-must-gather-output\") pod \"must-gather-hqkxc\" (UID: \"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5\") " pod="openshift-must-gather-s6bn5/must-gather-hqkxc" Apr 16 14:22:20.479074 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.479034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2fq\" (UniqueName: \"kubernetes.io/projected/01369e6b-6ad7-4df3-a695-d7d3dde2c4f5-kube-api-access-gk2fq\") pod \"must-gather-hqkxc\" (UID: \"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5\") " pod="openshift-must-gather-s6bn5/must-gather-hqkxc" Apr 16 14:22:20.479254 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.479094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/01369e6b-6ad7-4df3-a695-d7d3dde2c4f5-must-gather-output\") pod \"must-gather-hqkxc\" (UID: \"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5\") " pod="openshift-must-gather-s6bn5/must-gather-hqkxc" Apr 16 14:22:20.479375 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.479361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/01369e6b-6ad7-4df3-a695-d7d3dde2c4f5-must-gather-output\") pod \"must-gather-hqkxc\" (UID: \"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5\") " pod="openshift-must-gather-s6bn5/must-gather-hqkxc" Apr 16 14:22:20.489018 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.488983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2fq\" (UniqueName: \"kubernetes.io/projected/01369e6b-6ad7-4df3-a695-d7d3dde2c4f5-kube-api-access-gk2fq\") pod \"must-gather-hqkxc\" (UID: \"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5\") " pod="openshift-must-gather-s6bn5/must-gather-hqkxc" Apr 16 14:22:20.648058 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.648030 2575 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-s6bn5/must-gather-hqkxc" Apr 16 14:22:20.768793 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:20.768707 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s6bn5/must-gather-hqkxc"] Apr 16 14:22:20.771132 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:22:20.771100 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01369e6b_6ad7_4df3_a695_d7d3dde2c4f5.slice/crio-2a3f1d62f0252b05345e383f9463909551e0f70b01679a90528045ce3c4932a6 WatchSource:0}: Error finding container 2a3f1d62f0252b05345e383f9463909551e0f70b01679a90528045ce3c4932a6: Status 404 returned error can't find the container with id 2a3f1d62f0252b05345e383f9463909551e0f70b01679a90528045ce3c4932a6 Apr 16 14:22:21.036701 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:21.036610 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s6bn5/must-gather-hqkxc" event={"ID":"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5","Type":"ContainerStarted","Data":"2a3f1d62f0252b05345e383f9463909551e0f70b01679a90528045ce3c4932a6"} Apr 16 14:22:22.043629 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:22.042837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s6bn5/must-gather-hqkxc" event={"ID":"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5","Type":"ContainerStarted","Data":"fe7246ec2a1f33cb672d62ceb21ea07d30b421f091c808f80b7ea7522943deae"} Apr 16 14:22:22.043629 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:22.042883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s6bn5/must-gather-hqkxc" event={"ID":"01369e6b-6ad7-4df3-a695-d7d3dde2c4f5","Type":"ContainerStarted","Data":"8cff567e25be2d4f76eb13e2716178f682b312f8996c29c840fc9bcb50581bd5"} Apr 16 14:22:22.058925 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:22.058865 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s6bn5/must-gather-hqkxc" podStartSLOduration=1.148851673 podStartE2EDuration="2.058845426s" podCreationTimestamp="2026-04-16 14:22:20 +0000 UTC" firstStartedPulling="2026-04-16 14:22:20.772832554 +0000 UTC m=+1514.951081400" lastFinishedPulling="2026-04-16 14:22:21.682826305 +0000 UTC m=+1515.861075153" observedRunningTime="2026-04-16 14:22:22.058067054 +0000 UTC m=+1516.236315922" watchObservedRunningTime="2026-04-16 14:22:22.058845426 +0000 UTC m=+1516.237094295" Apr 16 14:22:23.165670 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:23.165621 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5wxx6_13c9904e-5fed-4ef4-845b-0a77e68bc8f7/global-pull-secret-syncer/0.log" Apr 16 14:22:23.300894 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:23.300843 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cr2k7_fa435174-0b32-4191-b42c-ad32bd3bc5db/konnectivity-agent/0.log" Apr 16 14:22:23.399657 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:23.399621 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-131.ec2.internal_d8d2889d114a061679832b8c70f242a6/haproxy/0.log" Apr 16 14:22:25.178410 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.178372 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kxdfc/must-gather-kfxjf"] Apr 16 14:22:25.179529 ip-10-0-141-131 kubenswrapper[2575]: I0416 
14:22:25.179497 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerName="copy" containerID="cri-o://9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521" gracePeriod=2 Apr 16 14:22:25.181548 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.181522 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kxdfc/must-gather-kfxjf"] Apr 16 14:22:25.181947 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.181916 2575 status_manager.go:895] "Failed to get status for pod" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" err="pods \"must-gather-kfxjf\" is forbidden: User \"system:node:ip-10-0-141-131.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kxdfc\": no relationship found between node 'ip-10-0-141-131.ec2.internal' and this object" Apr 16 14:22:25.555070 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.554824 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kxdfc_must-gather-kfxjf_ad49946b-adff-45ab-b8ca-ba83f7e48082/copy/0.log" Apr 16 14:22:25.555763 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.555452 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:22:25.558938 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.558889 2575 status_manager.go:895] "Failed to get status for pod" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" err="pods \"must-gather-kfxjf\" is forbidden: User \"system:node:ip-10-0-141-131.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kxdfc\": no relationship found between node 'ip-10-0-141-131.ec2.internal' and this object" Apr 16 14:22:25.630478 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.629919 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad49946b-adff-45ab-b8ca-ba83f7e48082-must-gather-output\") pod \"ad49946b-adff-45ab-b8ca-ba83f7e48082\" (UID: \"ad49946b-adff-45ab-b8ca-ba83f7e48082\") " Apr 16 14:22:25.630478 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.630011 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxc78\" (UniqueName: \"kubernetes.io/projected/ad49946b-adff-45ab-b8ca-ba83f7e48082-kube-api-access-jxc78\") pod \"ad49946b-adff-45ab-b8ca-ba83f7e48082\" (UID: \"ad49946b-adff-45ab-b8ca-ba83f7e48082\") " Apr 16 14:22:25.632751 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.632685 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad49946b-adff-45ab-b8ca-ba83f7e48082-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ad49946b-adff-45ab-b8ca-ba83f7e48082" (UID: "ad49946b-adff-45ab-b8ca-ba83f7e48082"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:25.637612 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.637576 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad49946b-adff-45ab-b8ca-ba83f7e48082-kube-api-access-jxc78" (OuterVolumeSpecName: "kube-api-access-jxc78") pod "ad49946b-adff-45ab-b8ca-ba83f7e48082" (UID: "ad49946b-adff-45ab-b8ca-ba83f7e48082"). InnerVolumeSpecName "kube-api-access-jxc78". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:22:25.731057 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.730986 2575 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ad49946b-adff-45ab-b8ca-ba83f7e48082-must-gather-output\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:22:25.731057 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:25.731028 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxc78\" (UniqueName: \"kubernetes.io/projected/ad49946b-adff-45ab-b8ca-ba83f7e48082-kube-api-access-jxc78\") on node \"ip-10-0-141-131.ec2.internal\" DevicePath \"\"" Apr 16 14:22:26.070597 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.070507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kxdfc_must-gather-kfxjf_ad49946b-adff-45ab-b8ca-ba83f7e48082/copy/0.log" Apr 16 14:22:26.075716 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.071360 2575 generic.go:358] "Generic (PLEG): container finished" podID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerID="9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521" exitCode=143 Apr 16 14:22:26.075716 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.071452 2575 scope.go:117] "RemoveContainer" containerID="9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521" Apr 16 14:22:26.075716 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.071588 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" Apr 16 14:22:26.076988 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.076934 2575 status_manager.go:895] "Failed to get status for pod" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" err="pods \"must-gather-kfxjf\" is forbidden: User \"system:node:ip-10-0-141-131.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kxdfc\": no relationship found between node 'ip-10-0-141-131.ec2.internal' and this object" Apr 16 14:22:26.087500 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.087173 2575 scope.go:117] "RemoveContainer" containerID="3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755" Apr 16 14:22:26.092357 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.092325 2575 status_manager.go:895] "Failed to get status for pod" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" err="pods \"must-gather-kfxjf\" is forbidden: User \"system:node:ip-10-0-141-131.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kxdfc\": no relationship found between node 'ip-10-0-141-131.ec2.internal' and this object" Apr 16 14:22:26.106094 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.105789 2575 scope.go:117] "RemoveContainer" containerID="9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521" Apr 16 14:22:26.106598 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:22:26.106329 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521\": container with ID starting with 9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521 not found: ID does not exist" containerID="9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521" Apr 16 14:22:26.106598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.106364 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521"} err="failed to get container status \"9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521\": rpc error: code = NotFound desc = could not find container \"9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521\": container with ID starting with 9bee906373e3a1fe0458379c64b144910648eae48804fd75f543b222c6dad521 not found: ID does not exist" Apr 16 14:22:26.106598 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.106390 2575 scope.go:117] "RemoveContainer" containerID="3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755" Apr 16 14:22:26.107215 ip-10-0-141-131 kubenswrapper[2575]: E0416 14:22:26.107121 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755\": container with ID starting with 3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755 not found: ID does not exist" containerID="3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755" Apr 16 14:22:26.107215 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.107157 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755"} err="failed to get 
container status \"3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755\": rpc error: code = NotFound desc = could not find container \"3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755\": container with ID starting with 3107bf4b4c61bd6cafc6ba326714cbd10ea19a5cc86af0074355d6553b4e2755 not found: ID does not exist" Apr 16 14:22:26.396027 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.395963 2575 status_manager.go:895] "Failed to get status for pod" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" pod="openshift-must-gather-kxdfc/must-gather-kfxjf" err="pods \"must-gather-kfxjf\" is forbidden: User \"system:node:ip-10-0-141-131.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kxdfc\": no relationship found between node 'ip-10-0-141-131.ec2.internal' and this object" Apr 16 14:22:26.398411 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:26.398372 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" path="/var/lib/kubelet/pods/ad49946b-adff-45ab-b8ca-ba83f7e48082/volumes" Apr 16 14:22:27.515286 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.515179 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-pn2zh_a3cc9243-9cb5-4af0-8004-572a25c168d1/monitoring-plugin/0.log" Apr 16 14:22:27.549632 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.549580 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4dhrq_21d470a0-3408-42d5-a74e-d66c383570a4/node-exporter/0.log" Apr 16 14:22:27.576568 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.576503 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4dhrq_21d470a0-3408-42d5-a74e-d66c383570a4/kube-rbac-proxy/0.log" Apr 16 14:22:27.602880 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.602845 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4dhrq_21d470a0-3408-42d5-a74e-d66c383570a4/init-textfile/0.log" Apr 16 14:22:27.891559 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.891529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e40b9af4-afc4-44d6-8e70-b76029a57119/prometheus/0.log" Apr 16 14:22:27.910936 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.910908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e40b9af4-afc4-44d6-8e70-b76029a57119/config-reloader/0.log" Apr 16 14:22:27.936641 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.936612 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e40b9af4-afc4-44d6-8e70-b76029a57119/thanos-sidecar/0.log" Apr 16 14:22:27.963247 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.963219 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e40b9af4-afc4-44d6-8e70-b76029a57119/kube-rbac-proxy-web/0.log" Apr 16 14:22:27.988060 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:27.988013 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e40b9af4-afc4-44d6-8e70-b76029a57119/kube-rbac-proxy/0.log" Apr 16 14:22:28.012183 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:28.012156 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e40b9af4-afc4-44d6-8e70-b76029a57119/kube-rbac-proxy-thanos/0.log" Apr 16 14:22:28.039025 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:28.038982 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e40b9af4-afc4-44d6-8e70-b76029a57119/init-config-reloader/0.log" Apr 16 14:22:28.073973 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:28.073938 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-xhcvf_ea940c5e-d3a3-4fb9-8758-aaada6c8070d/prometheus-operator/0.log" Apr 16 14:22:28.100166 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:28.100132 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-xhcvf_ea940c5e-d3a3-4fb9-8758-aaada6c8070d/kube-rbac-proxy/0.log" Apr 16 14:22:29.527379 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:29.527350 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-6mm5s_e84621c2-6f3b-487c-8a13-426a6d91539c/networking-console-plugin/0.log" Apr 16 14:22:29.935713 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:29.935682 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/2.log" Apr 16 14:22:29.940700 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:29.940667 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zxb7l_24046d5b-b6df-4005-85d5-01cafc82cc40/console-operator/3.log" Apr 16 14:22:30.498783 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.498752 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h"] Apr 16 14:22:30.499127 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.499110 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerName="copy" Apr 16 14:22:30.499215 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.499129 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerName="copy" Apr 16 14:22:30.499215 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.499170 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerName="gather" Apr 16 14:22:30.499215 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.499180 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerName="gather" Apr 16 14:22:30.499380 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.499254 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerName="copy" Apr 16 14:22:30.499380 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.499267 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad49946b-adff-45ab-b8ca-ba83f7e48082" containerName="gather" Apr 16 14:22:30.502641 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.502620 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.511508 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.511482 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h"] Apr 16 14:22:30.578979 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.578941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-sys\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.579442 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.578992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-podres\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.579442 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.579078 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-lib-modules\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.579442 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.579143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-proc\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.579442 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.579173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65cc\" (UniqueName: \"kubernetes.io/projected/4c6bd892-4b73-433b-bae9-2a0fe275e548-kube-api-access-f65cc\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680045 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-sys\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-podres\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-lib-modules\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-proc\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680133 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-sys\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f65cc\" (UniqueName: \"kubernetes.io/projected/4c6bd892-4b73-433b-bae9-2a0fe275e548-kube-api-access-f65cc\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-proc\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680239 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-lib-modules\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.680482 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.680243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4c6bd892-4b73-433b-bae9-2a0fe275e548-podres\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.688854 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.688821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65cc\" (UniqueName: \"kubernetes.io/projected/4c6bd892-4b73-433b-bae9-2a0fe275e548-kube-api-access-f65cc\") pod \"perf-node-gather-daemonset-qlf5h\" (UID: \"4c6bd892-4b73-433b-bae9-2a0fe275e548\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.746079 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.746042 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-rqmhf_331d1069-9cb3-438e-a5bd-46015afcf351/volume-data-source-validator/0.log" Apr 16 14:22:30.816020 ip-10-0-141-131 kubenswrapper[2575]: I0416 
14:22:30.815917 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:30.952320 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:30.952253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h"] Apr 16 14:22:30.957791 ip-10-0-141-131 kubenswrapper[2575]: W0416 14:22:30.957756 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c6bd892_4b73_433b_bae9_2a0fe275e548.slice/crio-cc5406ebc9d99ef7cbdab4aa00147d1a1bc5c52ba1c0d316f39754e5a14209e9 WatchSource:0}: Error finding container cc5406ebc9d99ef7cbdab4aa00147d1a1bc5c52ba1c0d316f39754e5a14209e9: Status 404 returned error can't find the container with id cc5406ebc9d99ef7cbdab4aa00147d1a1bc5c52ba1c0d316f39754e5a14209e9 Apr 16 14:22:31.089700 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:31.089658 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" event={"ID":"4c6bd892-4b73-433b-bae9-2a0fe275e548","Type":"ContainerStarted","Data":"9de069ff9ed427a3d272b5b2ec849892857ff8583a8e0789cad267aa13ab49ff"} Apr 16 14:22:31.089700 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:31.089696 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" event={"ID":"4c6bd892-4b73-433b-bae9-2a0fe275e548","Type":"ContainerStarted","Data":"cc5406ebc9d99ef7cbdab4aa00147d1a1bc5c52ba1c0d316f39754e5a14209e9"} Apr 16 14:22:31.107632 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:31.106667 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" podStartSLOduration=1.106644966 podStartE2EDuration="1.106644966s" podCreationTimestamp="2026-04-16 14:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:22:31.105441278 +0000 UTC m=+1525.283690149" watchObservedRunningTime="2026-04-16 14:22:31.106644966 +0000 UTC m=+1525.284893835" Apr 16 14:22:31.439244 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:31.439215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g5lb6_92799234-6fff-45be-a27c-c70096483d30/dns/0.log" Apr 16 14:22:31.463381 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:31.463348 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g5lb6_92799234-6fff-45be-a27c-c70096483d30/kube-rbac-proxy/0.log" Apr 16 14:22:31.565112 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:31.565081 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hvqjp_1b680da6-ab85-4c31-98d8-35be4b07624b/dns-node-resolver/0.log" Apr 16 14:22:32.034745 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:32.034700 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-89jl9_cb697396-e88e-4780-9f6a-2109bfc21e0f/node-ca/0.log" Apr 16 14:22:32.092476 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:32.092445 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:33.125470 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:33.125443 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-hgrw6_b77dfd1a-6bd6-449b-8db3-c93ff41eb18e/serve-healthcheck-canary/0.log" Apr 16 14:22:33.532244 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:33.532138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-5zl6t_18e2e1da-09fc-4969-99a4-1d53b1a12d83/insights-operator/0.log" Apr 16 14:22:33.534120 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:33.534095 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-5zl6t_18e2e1da-09fc-4969-99a4-1d53b1a12d83/insights-operator/1.log" Apr 16 14:22:33.557773 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:33.557747 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kjxps_39530f63-5235-4a51-ade7-9af0431be21f/kube-rbac-proxy/0.log" Apr 16 14:22:33.583575 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:33.583543 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kjxps_39530f63-5235-4a51-ade7-9af0431be21f/exporter/0.log" Apr 16 14:22:33.608902 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:33.608874 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kjxps_39530f63-5235-4a51-ade7-9af0431be21f/extractor/0.log" Apr 16 14:22:35.821405 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:35.821371 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-8xspw_75935f8c-8b9f-42fa-b199-a53244c5faff/s3-init/0.log" Apr 16 14:22:38.109398 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:38.109368 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-qlf5h" Apr 16 14:22:39.838481 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:39.838404 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-8pp5x_11464619-ce66-4292-8d50-8b67501941bf/migrator/0.log" Apr 16 14:22:39.864457 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:39.864370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-8pp5x_11464619-ce66-4292-8d50-8b67501941bf/graceful-termination/0.log" Apr 16 14:22:40.282043 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:40.282006 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qsrlw_e8c123a6-dc70-4989-aff2-c7374863a689/kube-storage-version-migrator-operator/1.log" Apr 16 14:22:40.283010 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:40.282987 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qsrlw_e8c123a6-dc70-4989-aff2-c7374863a689/kube-storage-version-migrator-operator/0.log" Apr 16 14:22:41.683066 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.683039 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xjl2f_85abdd4c-8c23-4bf2-b9a2-a5e83b75807a/kube-multus-additional-cni-plugins/0.log" Apr 16 14:22:41.707899 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.707870 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xjl2f_85abdd4c-8c23-4bf2-b9a2-a5e83b75807a/egress-router-binary-copy/0.log" Apr 16 14:22:41.734504 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.734481 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xjl2f_85abdd4c-8c23-4bf2-b9a2-a5e83b75807a/cni-plugins/0.log" Apr 16 14:22:41.760487 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.760456 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xjl2f_85abdd4c-8c23-4bf2-b9a2-a5e83b75807a/bond-cni-plugin/0.log" Apr 16 14:22:41.788319 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.788293 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xjl2f_85abdd4c-8c23-4bf2-b9a2-a5e83b75807a/routeoverride-cni/0.log" Apr 16 14:22:41.817880 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.817854 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xjl2f_85abdd4c-8c23-4bf2-b9a2-a5e83b75807a/whereabouts-cni-bincopy/0.log" Apr 16 14:22:41.844340 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.844285 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xjl2f_85abdd4c-8c23-4bf2-b9a2-a5e83b75807a/whereabouts-cni/0.log" Apr 16 14:22:41.883676 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.883590 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmczn_b3cb1a86-beca-4c98-9d1a-b08d033e57ac/kube-multus/0.log" Apr 16 14:22:41.960243 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.960217 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-crlsp_77d59171-3e29-4e55-a4d9-a076a67a50ce/network-metrics-daemon/0.log" Apr 16 14:22:41.983570 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:41.983541 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-crlsp_77d59171-3e29-4e55-a4d9-a076a67a50ce/kube-rbac-proxy/0.log" Apr 16 14:22:43.201551 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:43.201510 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8tl_04113a02-0dc7-42c8-a11b-4684fb794c4f/ovn-controller/0.log" Apr 16 14:22:43.241168 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:43.241138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8tl_04113a02-0dc7-42c8-a11b-4684fb794c4f/ovn-acl-logging/0.log" Apr 16 14:22:43.281549 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:43.281521 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8tl_04113a02-0dc7-42c8-a11b-4684fb794c4f/kube-rbac-proxy-node/0.log" Apr 16 14:22:43.326480 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:43.326448 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8tl_04113a02-0dc7-42c8-a11b-4684fb794c4f/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 14:22:43.371057 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:43.371021 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8tl_04113a02-0dc7-42c8-a11b-4684fb794c4f/northd/0.log" Apr 16 14:22:43.416859 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:43.416830 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8tl_04113a02-0dc7-42c8-a11b-4684fb794c4f/nbdb/0.log" Apr 16 14:22:43.467899 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:43.467773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8tl_04113a02-0dc7-42c8-a11b-4684fb794c4f/sbdb/0.log" Apr 16 14:22:43.596659 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:43.596626 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8tl_04113a02-0dc7-42c8-a11b-4684fb794c4f/ovnkube-controller/0.log" Apr 16 14:22:44.916961 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:44.916932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-94g8p_6597c240-5448-4187-9f6c-4e1e6d7b7aa0/check-endpoints/0.log" Apr 16 14:22:45.006653 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:45.006622 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-96bs9_b45111e6-3682-445c-ac82-d2870a0cac78/network-check-target-container/0.log" Apr 16 14:22:46.044673 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:46.044645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5kvpq_0a0fcf8a-7d68-4b75-b145-75ba1622662d/iptables-alerter/0.log" Apr 16 14:22:46.835138 ip-10-0-141-131 kubenswrapper[2575]: I0416 14:22:46.835105 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-xfspm_87cef1d3-c711-4f53-a775-8b55ec1bcf86/tuned/0.log"