Apr 16 14:49:53.122588 ip-10-0-142-46 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 14:49:53.122601 ip-10-0-142-46 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 14:49:53.122609 ip-10-0-142-46 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 14:49:53.122853 ip-10-0-142-46 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 14:50:03.176516 ip-10-0-142-46 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 14:50:03.176535 ip-10-0-142-46 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a4d23d80d8e545c08108405075a68d25 --
Apr 16 14:52:15.527485 ip-10-0-142-46 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:16.062272 ip-10-0-142-46 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:16.062272 ip-10-0-142-46 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:16.062272 ip-10-0-142-46 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:16.062272 ip-10-0-142-46 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:16.062272 ip-10-0-142-46 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:16.064915 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.064823 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:16.071974 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071952 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:16.071974 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071970 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:16.071974 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071975 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:16.071974 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071978 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:16.071974 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071981 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071985 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071988 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071991 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071994 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.071997 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072001 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072004 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072007 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072009 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072012 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072015 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072017 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072020 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072023 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072025 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072028 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072030 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072033 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:16.072177 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072036 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072038 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072041 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072043 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072047 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072050 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072053 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072055 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072058 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072060 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072063 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072065 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072068 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072070 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072073 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072077 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072080 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072082 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072085 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:16.072627 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072088 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072092 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
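Three of the deprecated flags warned about above (--container-runtime-endpoint, --volume-plugin-dir, --system-reserved) map directly to fields of the KubeletConfiguration file passed via --config, and --minimum-container-ttl-duration is superseded by eviction settings. A minimal sketch of the equivalent config file (field values here are illustrative examples, not the values this node actually uses):

```yaml
# Hypothetical KubeletConfiguration fragment (values are examples only)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:              # replaces --minimum-container-ttl-duration-style GC tuning
  memory.available: 100Mi
```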
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072097 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072100 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072103 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072106 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072108 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072111 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072114 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072117 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072119 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072122 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072125 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072127 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072130 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072133 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072135 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072138 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072140 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:16.073123 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072143 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072146 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072149 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072152 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072155 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072158 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072160 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072163 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072166 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072184 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072189 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072192 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072195 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072197 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072200 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072203 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072206 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072210 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072212 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072215 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:16.073607 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072217 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072220 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072223 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072225 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072228 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072626 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072630 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072633 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072636 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072639 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072642 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072645 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072647 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072650 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072652 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072655 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072658 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072660 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072663 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072665 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:16.074090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072668 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072672 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072675 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072678 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072681 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072684 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072686 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072689 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072691 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072694 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072697 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072700 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072703 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072705 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072707 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072710 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072713 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072715 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072717 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072720 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:16.074592 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072722 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072725 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072727 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072730 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072732 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072736 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072738 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072741 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072744 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072747 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072750 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072752 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072754 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072758 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072760 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072763 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072765 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072768 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072770 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072772 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:16.075087 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072775 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072777 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072780 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072783 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072785 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072788 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072791 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072794 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072796 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072799 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072801 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072803 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072806 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072810 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
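The feature-gate warnings above repeat because the kubelet parses its gate map more than once at startup, so the same gate names appear under several timestamps. A small sketch that dedupes the gate names from journal output like this (the klog line format is taken directly from the log; `unrecognized_gates` is a name introduced here for illustration):

```python
import re

# Matches klog-style lines such as:
#   W0416 14:52:16.072030 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
GATE_RE = re.compile(r"feature_gate\.go:\d+\] unrecognized feature gate: (\S+)")

def unrecognized_gates(log_text: str) -> set[str]:
    """Return the unique gate names flagged as unrecognized."""
    return set(GATE_RE.findall(log_text))

sample = (
    "W0416 14:52:16.071952 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud\n"
    "W0416 14:52:16.072633 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud\n"
    "W0416 14:52:16.071970 2565 feature_gate.go:328] unrecognized feature gate: Example2\n"
)
print(sorted(unrecognized_gates(sample)))  # ['ClusterAPIInstallIBMCloud', 'Example2']
```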
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072814 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072817 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072819 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072822 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072824 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:16.075632 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072829 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072833 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072836 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072839 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072842 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072844 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072848 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072850 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072853 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072855 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072858 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.072860 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074086 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074095 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074103 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074107 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074113 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074116 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074121 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074125 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074130 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:16.076090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074133 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074136 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074140 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074143 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074146 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074149 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074152 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074155 2565 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074158 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074161 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074165 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074181 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074184 2565 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074187 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074191 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074195 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074198 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074201 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074205 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074208 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074211 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074214 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074217 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074220 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074224 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:16.076607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074227 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074230 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074233 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074237 2565 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074240 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074245 2565 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074248 2565 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074251 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074254 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074257 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074261 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074264 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074267 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074270 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074273 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074276 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074279 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074282 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 14:52:16.077217
ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074284 2565 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074287 2565 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074290 2565 flags.go:64] FLAG: --feature-gates="" Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074294 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074297 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074300 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074304 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074307 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:52:16.077217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074310 2565 flags.go:64] FLAG: --help="false" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074313 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074316 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074318 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074321 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074325 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 
14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074328 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074331 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074334 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074337 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074340 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074347 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074350 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074353 2565 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074356 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074359 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074362 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074365 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074368 2565 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074371 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074374 2565 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074377 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074383 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:16.077845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074386 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074389 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074392 2565 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074395 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074399 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074402 2565 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074404 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074409 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074412 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074417 2565 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074420 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074423 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: 
I0416 14:52:16.074426 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074429 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074432 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074435 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074438 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074446 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074449 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074452 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074458 2565 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074462 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074466 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074470 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:16.078416 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074474 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074477 2565 flags.go:64] FLAG: --port="10250" Apr 16 14:52:16.079001 ip-10-0-142-46 
kubenswrapper[2565]: I0416 14:52:16.074480 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074483 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06a2f6c19313379dc" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074486 2565 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074489 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074492 2565 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074495 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074498 2565 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074502 2565 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074505 2565 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074507 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074510 2565 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074514 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074517 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074520 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074522 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:16.079001 ip-10-0-142-46 
kubenswrapper[2565]: I0416 14:52:16.074525 2565 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074528 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074531 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074534 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074537 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074540 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074543 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074546 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074549 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:52:16.079001 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074552 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074555 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074557 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074562 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074566 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074569 2565 flags.go:64] FLAG: 
--system-cgroups="" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074571 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074577 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074580 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074583 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074587 2565 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074590 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074592 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074595 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074598 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074602 2565 flags.go:64] FLAG: --v="2" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074606 2565 flags.go:64] FLAG: --version="false" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074610 2565 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074614 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.074617 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074715 
2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074719 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074722 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074726 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:16.079666 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074728 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074731 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074734 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074736 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074739 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074741 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074744 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074746 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074749 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 
14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074751 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074754 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074758 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074761 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074764 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074773 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074778 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074781 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074783 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074786 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:16.080253 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074790 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074794 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074797 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074800 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074802 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074805 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074807 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074810 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074812 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074815 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074817 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074820 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074823 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074825 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074828 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074831 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074834 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074836 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074838 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074841 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:16.080771 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074843 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074846 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074849 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074851 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074855 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074857 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074861 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074863 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074866 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074868 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074871 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074874 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074876 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074879 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074881 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074884 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074886 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074889 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074892 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074894 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:16.081268 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074896 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074899 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074901 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074904 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074907 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074910 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074912 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074914 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074917 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074919 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074922 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074924 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074926 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074929 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074931 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074934 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074938 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074940 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074943 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074946 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:16.081748 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074948 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:16.082266 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074951 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:16.082266 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.074953 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:16.082266 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.075862 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:16.084136 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.084114 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 14:52:16.084187 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.084137 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 14:52:16.084220 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084199 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:16.084220 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084205 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:16.084220 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084209 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:16.084220 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084212 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:16.084220 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084215 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:16.084220 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084218 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:16.084220 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084221 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:16.084220 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084224 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084227 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084230 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084233 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084236 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084238 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084241 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084243 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084246 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084249 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084251 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084254 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084256 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084259 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084262 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084264 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084267 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084269 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084273 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:16.084418 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084275 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084278 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084281 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084283 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084286 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084288 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084291 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084294 2565 feature_gate.go:328] unrecognized feature
gate: MultiDiskSetup Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084296 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084299 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084302 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084304 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084307 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084309 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084311 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084315 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084318 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084320 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084323 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084325 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:16.084890 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084328 
2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084331 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084333 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084336 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084338 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084341 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084345 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084349 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084352 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084355 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084358 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084360 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084363 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084367 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084370 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084373 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084375 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084378 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084380 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:16.085441 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084383 2565 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084385 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084388 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084390 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084393 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084395 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084399 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084403 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084406 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084410 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084413 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084416 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084419 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084422 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084425 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084428 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084431 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084434 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084436 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084439 2565 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 16 14:52:16.085909 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084441 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.084446 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084549 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084554 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084557 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084560 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084563 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084566 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084569 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084572 2565 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084574 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084577 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084579 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084582 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084585 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:16.086440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084587 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084590 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084592 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084595 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084597 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084601 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084604 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: 
W0416 14:52:16.084607 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084610 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084613 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084616 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084619 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084621 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084624 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084626 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084629 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084631 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084634 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084636 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:16.086832 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084639 2565 feature_gate.go:328] unrecognized 
feature gate: GCPCustomAPIEndpoints Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084642 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084644 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084647 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084649 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084652 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084655 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084658 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084660 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084663 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084665 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084668 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084671 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084673 2565 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084675 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084678 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084680 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084683 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084686 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084689 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:16.087314 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084692 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084694 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084697 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084700 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084703 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084705 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 
14:52:16.084708 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084712 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084715 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084719 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084722 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084725 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084728 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084730 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084733 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084735 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084738 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084741 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084744 2565 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerificationPKI Apr 16 14:52:16.087802 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084746 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084749 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084751 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084754 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084756 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084759 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084761 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084764 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084766 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084769 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084772 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084775 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 
14:52:16.084778 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084781 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:16.084783 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:16.088309 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.084788 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:16.088684 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.085582 2565 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 14:52:16.088864 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.088850 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 14:52:16.090236 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.090225 2565 server.go:1019] "Starting client certificate rotation" Apr 16 14:52:16.090341 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.090326 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:16.090376 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.090367 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:16.122025 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.122003 2565 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 14:52:16.125237 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.125218 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 14:52:16.143828 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.143803 2565 log.go:25] "Validated CRI v1 runtime API" Apr 16 14:52:16.149744 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.149727 2565 log.go:25] "Validated CRI v1 image API" Apr 16 14:52:16.154133 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.154114 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:16.154497 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.154478 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 14:52:16.158071 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.158050 2565 fs.go:135] Filesystem UUIDs: map[37f9f47c-f39d-4de5-9aab-a93d1c493bfd:/dev/nvme0n1p3 42da34f8-ea3f-4366-9e7b-16b1092cfa86:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 16 14:52:16.158146 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.158070 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 14:52:16.164069 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.163958 2565 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:16.162624386 +0000 UTC 
m=+0.503755457 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099818 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26309e4fdaf0c2f9a47a0fed096157 SystemUUID:ec26309e-4fda-f0c2-f9a4-7a0fed096157 BootID:a4d23d80-d8e5-45c0-8108-405075a68d25 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c7:0c:01:93:93 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c7:0c:01:93:93 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:1d:a2:18:fe:34 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction 
Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 14:52:16.164069 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.164066 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 14:52:16.164188 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.164145 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 14:52:16.164556 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.164534 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 14:52:16.164689 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.164558 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-142-46.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 14:52:16.164730 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.164699 2565 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 14:52:16.164730 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.164708 2565 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 14:52:16.164730 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.164722 2565 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:16.164810 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.164740 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:16.166297 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.166286 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:16.166401 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.166392 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 14:52:16.168554 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.168537 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gg54x" Apr 16 14:52:16.169073 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.169063 2565 kubelet.go:491] "Attempting to sync node with API server" Apr 16 14:52:16.169147 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.169078 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 14:52:16.169147 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.169089 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 14:52:16.169147 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.169100 2565 kubelet.go:397] "Adding apiserver pod source" Apr 16 14:52:16.169147 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.169118 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 14:52:16.170488 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.170477 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:16.170530 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.170495 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:16.173767 ip-10-0-142-46 
kubenswrapper[2565]: I0416 14:52:16.173748 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gg54x" Apr 16 14:52:16.174730 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.174710 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 14:52:16.176721 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.176703 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:52:16.178124 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178112 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178129 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178136 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178141 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178147 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178153 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178159 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178165 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 
14:52:16.178184 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:16.178194 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178190 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:16.178447 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178208 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:16.178447 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.178217 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:16.179164 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.179155 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:16.179164 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.179164 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:52:16.182896 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.182880 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:16.182965 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.182917 2565 server.go:1295] "Started kubelet" Apr 16 14:52:16.184840 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.183020 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:16.184907 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.184864 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:52:16.185506 ip-10-0-142-46 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 14:52:16.186324 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.186308 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:16.187751 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.187105 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:52:16.188155 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.188137 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:16.189629 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.189610 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-142-46.ec2.internal" not found Apr 16 14:52:16.192408 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.192390 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:16.193015 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.193001 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:52:16.197566 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:16.197547 2565 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:52:16.198720 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.198707 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:52:16.198720 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.198713 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:16.199450 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199399 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:52:16.199538 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199506 2565 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 14:52:16.199538 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199530 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:52:16.199862 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199654 2565 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:52:16.199862 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199663 2565 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:52:16.199862 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:16.199712 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-46.ec2.internal\" not found" Apr 16 14:52:16.199862 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199787 2565 factory.go:153] Registering CRI-O factory Apr 16 14:52:16.199862 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199814 2565 factory.go:223] Registration of the crio container factory successfully Apr 16 14:52:16.200071 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199891 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such 
file or directory Apr 16 14:52:16.200071 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199903 2565 factory.go:55] Registering systemd factory Apr 16 14:52:16.200071 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199911 2565 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:52:16.200071 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199933 2565 factory.go:103] Registering Raw factory Apr 16 14:52:16.200071 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.199948 2565 manager.go:1196] Started watching for new ooms in manager Apr 16 14:52:16.200595 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.200578 2565 manager.go:319] Starting recovery of all containers Apr 16 14:52:16.200793 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.200772 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:16.206651 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.206632 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-142-46.ec2.internal" not found Apr 16 14:52:16.206753 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:16.206712 2565 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-46.ec2.internal\" not found" node="ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.210762 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.210654 2565 manager.go:324] Recovery completed Apr 16 14:52:16.214834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.214822 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:16.216750 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.216736 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-46.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:16.216797 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.216766 2565 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-142-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:16.216797 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.216780 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-46.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:16.217624 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.217605 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 14:52:16.217624 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.217623 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 14:52:16.217755 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.217643 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:16.220526 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.220511 2565 policy_none.go:49] "None policy: Start" Apr 16 14:52:16.220589 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.220535 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 14:52:16.221076 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.221066 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:52:16.261548 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.261524 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-142-46.ec2.internal" not found Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.261801 2565 manager.go:341] "Starting Device Plugin manager" Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:16.261836 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.261846 2565 server.go:85] "Starting device plugin registration server" Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.262088 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 
14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.262099 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.262214 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.262286 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.262293 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:16.262861 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 14:52:16.265953 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:16.262899 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-46.ec2.internal\" not found" Apr 16 14:52:16.320885 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.320799 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:16.322051 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.322034 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:16.322134 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.322064 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:16.322134 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.322091 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 14:52:16.322134 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.322102 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:16.322281 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:16.322146 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:16.324926 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.324908 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:16.362578 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.362553 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:16.363596 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.363578 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-46.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:16.363697 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.363607 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:16.363697 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.363619 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-46.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:16.363697 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.363644 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.372508 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.372487 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.372588 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:16.372515 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-46.ec2.internal\": node \"ip-10-0-142-46.ec2.internal\" not found" Apr 16 14:52:16.422909 
ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.422862 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal"] Apr 16 14:52:16.426157 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.426134 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.426293 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.426136 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.444686 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.444668 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.448669 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.448656 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.459656 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.459635 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:16.459732 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.459636 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:16.501031 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.501000 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4c16a30a2810a56bab96c6b9f99d7cfc-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal\" (UID: \"4c16a30a2810a56bab96c6b9f99d7cfc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.501031 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.501031 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c16a30a2810a56bab96c6b9f99d7cfc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal\" (UID: \"4c16a30a2810a56bab96c6b9f99d7cfc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.501235 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.501050 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/540800c0848b372f11baec71c30048e7-config\") pod \"kube-apiserver-proxy-ip-10-0-142-46.ec2.internal\" (UID: \"540800c0848b372f11baec71c30048e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.601886 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.601820 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4c16a30a2810a56bab96c6b9f99d7cfc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal\" (UID: \"4c16a30a2810a56bab96c6b9f99d7cfc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.601886 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.601849 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c16a30a2810a56bab96c6b9f99d7cfc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal\" (UID: \"4c16a30a2810a56bab96c6b9f99d7cfc\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.601886 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.601869 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/540800c0848b372f11baec71c30048e7-config\") pod \"kube-apiserver-proxy-ip-10-0-142-46.ec2.internal\" (UID: \"540800c0848b372f11baec71c30048e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.602070 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.601925 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/540800c0848b372f11baec71c30048e7-config\") pod \"kube-apiserver-proxy-ip-10-0-142-46.ec2.internal\" (UID: \"540800c0848b372f11baec71c30048e7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.602070 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.601937 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c16a30a2810a56bab96c6b9f99d7cfc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal\" (UID: \"4c16a30a2810a56bab96c6b9f99d7cfc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.602070 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.601966 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4c16a30a2810a56bab96c6b9f99d7cfc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal\" (UID: \"4c16a30a2810a56bab96c6b9f99d7cfc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.762068 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.762029 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" Apr 16 14:52:16.765741 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:16.765724 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" Apr 16 14:52:17.089619 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.089589 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:52:17.090254 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.089787 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:17.090254 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.089797 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:17.090254 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.089787 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:17.169693 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.169664 2565 apiserver.go:52] "Watching apiserver" Apr 16 14:52:17.175035 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.175015 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:52:17.175419 ip-10-0-142-46 kubenswrapper[2565]: I0416 
14:52:17.175398 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jf55j","openshift-multus/network-metrics-daemon-j76vn","openshift-ovn-kubernetes/ovnkube-node-t4krl","kube-system/konnectivity-agent-ldwhr","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr","openshift-cluster-node-tuning-operator/tuned-cj56j","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal","openshift-network-diagnostics/network-check-target-d7tkp","openshift-network-operator/iptables-alerter-5zt6p","kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal","openshift-dns/node-resolver-f4596","openshift-image-registry/node-ca-fpnf7","openshift-multus/multus-additional-cni-plugins-4mtb6"] Apr 16 14:52:17.175865 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.175840 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:16 +0000 UTC" deadline="2028-01-25 07:26:59.492481686 +0000 UTC" Apr 16 14:52:17.175865 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.175863 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15568h34m42.316621472s" Apr 16 14:52:17.177216 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.177192 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.179663 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.179648 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 14:52:17.179819 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.179801 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 14:52:17.179927 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.179912 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tc7xh\""
Apr 16 14:52:17.180252 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.180147 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 14:52:17.180252 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.180181 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 14:52:17.181427 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.181411 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:17.181512 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.181478 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:17.182902 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.182885 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.184195 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.184179 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:17.184649 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.184631 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 14:52:17.184724 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.184675 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 14:52:17.184940 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.184925 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 14:52:17.184990 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.184943 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 14:52:17.184990 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.184931 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fpdhf\""
Apr 16 14:52:17.185075 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.185063 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 14:52:17.185225 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.185207 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 14:52:17.185446 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.185428 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.186164 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.186151 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-z8nh9\""
Apr 16 14:52:17.186290 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.186152 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 14:52:17.186381 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.186367 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 14:52:17.186778 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.186756 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.187355 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.187342 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 14:52:17.187427 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.187355 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8cxk6\""
Apr 16 14:52:17.187511 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.187498 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 14:52:17.187619 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.187603 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 14:52:17.188255 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.188241 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:17.188319 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.188302 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:17.188598 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.188584 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-58s96\""
Apr 16 14:52:17.188636 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.188608 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:17.188636 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.188629 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:17.189613 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.189598 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5zt6p"
Apr 16 14:52:17.191069 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.191054 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f4596"
Apr 16 14:52:17.191653 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.191626 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 14:52:17.191749 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.191709 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:17.191749 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.191728 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lvsxn\""
Apr 16 14:52:17.191890 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.191814 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:17.192504 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.192487 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fpnf7"
Apr 16 14:52:17.192982 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.192962 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 14:52:17.193064 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.193020 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cj8gs\""
Apr 16 14:52:17.193346 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.193328 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 14:52:17.194145 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.194120 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.195092 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.194642 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 14:52:17.195222 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.195110 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 14:52:17.199557 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.198219 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-847j8\""
Apr 16 14:52:17.199557 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.198377 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 14:52:17.199557 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.198480 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vgtz9\""
Apr 16 14:52:17.199557 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.198611 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 14:52:17.199557 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.198745 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 14:52:17.199557 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.198873 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:17.201044 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.201027 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 14:52:17.205989 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.205969 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.206080 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206003 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovnkube-script-lib\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.206080 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206039 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pcz9\" (UniqueName: \"kubernetes.io/projected/84ace2a9-8bcc-47b5-81bb-c764aa280104-kube-api-access-7pcz9\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.206080 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206063 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8p2r\" (UniqueName: \"kubernetes.io/projected/fd96140b-7f7e-4208-9c8d-400e5a881b11-kube-api-access-v8p2r\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p"
Apr 16 14:52:17.206219 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206089 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-tmp-dir\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596"
Apr 16 14:52:17.206219 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206112 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-host\") pod \"node-ca-fpnf7\" (UID: \"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7"
Apr 16 14:52:17.206302 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206238 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-systemd\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.206302 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206264 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m56t\" (UniqueName: \"kubernetes.io/projected/5172b522-bc83-41f2-8760-e2fba5340ff1-kube-api-access-8m56t\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.206302 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206288 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysctl-conf\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.206422 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206311 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-var-lib-kubelet\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.206422 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206369 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fd96140b-7f7e-4208-9c8d-400e5a881b11-iptables-alerter-script\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p"
Apr 16 14:52:17.206422 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206393 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-ovn\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.206422 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206417 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/233701c2-920c-46da-8d99-9ee0fe62c01a-agent-certs\") pod \"konnectivity-agent-ldwhr\" (UID: \"233701c2-920c-46da-8d99-9ee0fe62c01a\") " pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:17.206569 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206441 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-system-cni-dir\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.206569 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206474 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-cnibin\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.206569 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206497 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-etc-kubernetes\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.206569 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206518 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:17.206736 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206569 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-log-socket\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.206736 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206610 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.206736 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206631 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-lib-modules\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.206736 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206646 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-kubelet\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.206736 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206660 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-etc-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.206736 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206674 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-kubernetes\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.206736 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206701 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-run\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206765 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-kubelet\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206814 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-netns\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206842 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206883 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-system-cni-dir\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206906 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-tuned\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206931 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g4cd\" (UniqueName: \"kubernetes.io/projected/c578c137-c4e7-4fd5-8394-2a72b0661d12-kube-api-access-2g4cd\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206955 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4mk\" (UniqueName: \"kubernetes.io/projected/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-kube-api-access-cr4mk\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.206980 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovn-node-metrics-cert\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207004 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/233701c2-920c-46da-8d99-9ee0fe62c01a-konnectivity-ca\") pod \"konnectivity-agent-ldwhr\" (UID: \"233701c2-920c-46da-8d99-9ee0fe62c01a\") " pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207029 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-registration-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207061 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-os-release\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.207090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207086 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-cni-binary-copy\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207160 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-socket-dir-parent\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207203 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-multus-certs\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207245 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-sys-fs\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207272 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207345 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207376 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-cni-multus\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207407 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-os-release\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207441 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207466 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfdsn\" (UniqueName: \"kubernetes.io/projected/38d86a56-d8b6-4bb2-a413-3166ca14717f-kube-api-access-cfdsn\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207496 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-slash\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207519 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-conf-dir\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207542 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysctl-d\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207572 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-run-netns\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207607 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-cni-bin\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.207647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207638 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-device-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207655 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c578c137-c4e7-4fd5-8394-2a72b0661d12-tmp\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207670 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd96140b-7f7e-4208-9c8d-400e5a881b11-host-slash\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207686 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-hosts-file\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207699 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6v76\" (UniqueName: \"kubernetes.io/projected/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-kube-api-access-s6v76\") pod \"node-ca-fpnf7\" (UID: \"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207750 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-env-overrides\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207776 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-socket-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207792 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88ww\" (UniqueName: \"kubernetes.io/projected/4d508526-4ec1-4ecd-be56-0426bc2e4469-kube-api-access-b88ww\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207814 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207829 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-var-lib-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207849 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-node-log\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207863 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-daemon-config\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207886 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-cni-dir\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207900 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-cni-bin\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207921 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207938 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-k8s-cni-cncf-io\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.208189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207953 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-cni-netd\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207976 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovnkube-config\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.207994 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-etc-selinux\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208016 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-cnibin\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.208834 ip-10-0-142-46
kubenswrapper[2565]: I0416 14:52:17.208029 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cbrc\" (UniqueName: \"kubernetes.io/projected/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-kube-api-access-5cbrc\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208048 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-modprobe-d\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208068 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysconfig\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208081 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-systemd\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208098 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-hostroot\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " 
pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208115 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-systemd-units\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208141 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208202 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-sys\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208246 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-host\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208317 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-serviceca\") pod 
\"node-ca-fpnf7\" (UID: \"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7" Apr 16 14:52:17.208834 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.208510 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:17.230271 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.230250 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gktk6" Apr 16 14:52:17.235310 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.235292 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gktk6" Apr 16 14:52:17.287893 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.287851 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c16a30a2810a56bab96c6b9f99d7cfc.slice/crio-b2a1c4bc7209fd24f0d1e24492a9f216dc8fff5f34ef5294fe3df90eb8695948 WatchSource:0}: Error finding container b2a1c4bc7209fd24f0d1e24492a9f216dc8fff5f34ef5294fe3df90eb8695948: Status 404 returned error can't find the container with id b2a1c4bc7209fd24f0d1e24492a9f216dc8fff5f34ef5294fe3df90eb8695948 Apr 16 14:52:17.288120 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.288089 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod540800c0848b372f11baec71c30048e7.slice/crio-c62b093e667193f435c42fb9e32273e3488fd4be6d964745e5cdbaf65ff0d920 WatchSource:0}: Error finding container c62b093e667193f435c42fb9e32273e3488fd4be6d964745e5cdbaf65ff0d920: Status 404 returned error can't find the container with id c62b093e667193f435c42fb9e32273e3488fd4be6d964745e5cdbaf65ff0d920 Apr 16 14:52:17.293783 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.293768 2565 provider.go:93] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:17.308764 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308710 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp" Apr 16 14:52:17.308874 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308786 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-var-lib-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.308874 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308816 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-node-log\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.308874 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-daemon-config\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309022 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308884 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-cni-dir\") pod 
\"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309022 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308898 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-var-lib-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.309022 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308911 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-cni-bin\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309022 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308954 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.309022 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308984 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-k8s-cni-cncf-io\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309022 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308993 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-node-log\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.309022 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.308958 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-cni-bin\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309008 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-cni-netd\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309040 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-cni-netd\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309059 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309068 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovnkube-config\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309078 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-k8s-cni-cncf-io\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309098 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-etc-selinux\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309116 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-cni-dir\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309124 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-cnibin\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309185 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5cbrc\" (UniqueName: \"kubernetes.io/projected/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-kube-api-access-5cbrc\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309213 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-modprobe-d\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309238 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysconfig\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309249 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-etc-selinux\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309264 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-systemd\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309292 2565 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-hostroot\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309317 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-systemd-units\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309342 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-modprobe-d\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.309386 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309374 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-hostroot\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-systemd\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309429 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysconfig\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309458 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-systemd-units\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309493 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309503 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-cnibin\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309526 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-sys\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309553 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-host\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309577 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-serviceca\") pod \"node-ca-fpnf7\" (UID: \"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309619 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-daemon-config\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309632 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309661 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovnkube-script-lib\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309699 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-host\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309714 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pcz9\" (UniqueName: \"kubernetes.io/projected/84ace2a9-8bcc-47b5-81bb-c764aa280104-kube-api-access-7pcz9\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309741 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8p2r\" (UniqueName: \"kubernetes.io/projected/fd96140b-7f7e-4208-9c8d-400e5a881b11-kube-api-access-v8p2r\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309747 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309764 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-tmp-dir\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309788 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-host\") pod \"node-ca-fpnf7\" (UID: \"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7" Apr 16 14:52:17.310109 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309797 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovnkube-config\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309812 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-systemd\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309803 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-sys\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309838 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8m56t\" (UniqueName: \"kubernetes.io/projected/5172b522-bc83-41f2-8760-e2fba5340ff1-kube-api-access-8m56t\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6" Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309866 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysctl-conf\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309890 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-var-lib-kubelet\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309914 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fd96140b-7f7e-4208-9c8d-400e5a881b11-iptables-alerter-script\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309938 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-ovn\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309963 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/233701c2-920c-46da-8d99-9ee0fe62c01a-agent-certs\") pod \"konnectivity-agent-ldwhr\" (UID: \"233701c2-920c-46da-8d99-9ee0fe62c01a\") " pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.309988 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-system-cni-dir\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310010 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-cnibin\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310033 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-etc-kubernetes\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310051 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-serviceca\") pod \"node-ca-fpnf7\" (UID: \"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310080 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-cni-binary-copy\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310119 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-var-lib-kubelet\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310143 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-log-socket\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310186 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.311100 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.310198 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310211 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-lib-modules\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-system-cni-dir\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310234 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-kubelet\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.310270 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:17.810237485 +0000 UTC m=+2.151368318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310288 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-cnibin\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310340 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-ovn\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310389 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-etc-kubernetes\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310434 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-etc-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310467 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-kubernetes\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-host\") pod \"node-ca-fpnf7\" (UID: \"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovnkube-script-lib\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310494 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-run\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310534 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-etc-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310539 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-kubelet\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310538 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-tmp-dir\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310573 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-netns\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.311833 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310575 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-systemd\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310623 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310632 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-kubernetes\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310653 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fd96140b-7f7e-4208-9c8d-400e5a881b11-iptables-alerter-script\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310667 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-kubelet\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310658 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysctl-conf\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310688 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-log-socket\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310699 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-netns\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310734 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-run\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310750 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310796 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-kubelet\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310822 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310851 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-system-cni-dir\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-tuned\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310898 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2g4cd\" (UniqueName: \"kubernetes.io/projected/c578c137-c4e7-4fd5-8394-2a72b0661d12-kube-api-access-2g4cd\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310922 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4mk\" (UniqueName: \"kubernetes.io/projected/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-kube-api-access-cr4mk\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310929 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-system-cni-dir\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310945 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovn-node-metrics-cert\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.312611 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311012 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/233701c2-920c-46da-8d99-9ee0fe62c01a-konnectivity-ca\") pod \"konnectivity-agent-ldwhr\" (UID: \"233701c2-920c-46da-8d99-9ee0fe62c01a\") " pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311039 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-registration-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311126 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-os-release\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311185 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-cni-binary-copy\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311214 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-socket-dir-parent\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311259 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-multus-certs\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311287 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-sys-fs\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311331 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311378 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-os-release\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.310853 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-lib-modules\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311757 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-registration-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311809 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-run-multus-certs\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311891 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/233701c2-920c-46da-8d99-9ee0fe62c01a-konnectivity-ca\") pod \"konnectivity-agent-ldwhr\" (UID: \"233701c2-920c-46da-8d99-9ee0fe62c01a\") " pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311949 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-socket-dir-parent\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311955 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5172b522-bc83-41f2-8760-e2fba5340ff1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312047 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-sys-fs\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312286 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-cni-binary-copy\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.313425 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312615 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.311435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-cni-multus\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312747 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-os-release\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312769 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312819 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-host-var-lib-cni-multus\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312828 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-os-release\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312862 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-run-openvswitch\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312883 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfdsn\" (UniqueName: \"kubernetes.io/projected/38d86a56-d8b6-4bb2-a413-3166ca14717f-kube-api-access-cfdsn\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312902 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-slash\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312926 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-conf-dir\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312950 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysctl-d\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312973 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-run-netns\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.312998 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-cni-bin\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313007 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-multus-conf-dir\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313022 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-device-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c578c137-c4e7-4fd5-8394-2a72b0661d12-tmp\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313051 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-slash\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313070 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd96140b-7f7e-4208-9c8d-400e5a881b11-host-slash\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313089 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-cni-bin\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313094 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-hosts-file\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313118 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6v76\" (UniqueName: \"kubernetes.io/projected/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-kube-api-access-s6v76\") pod \"node-ca-fpnf7\" (UID: \"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313130 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84ace2a9-8bcc-47b5-81bb-c764aa280104-host-run-netns\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313142 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-env-overrides\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313153 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-sysctl-d\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313209 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd96140b-7f7e-4208-9c8d-400e5a881b11-host-slash\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313221 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-device-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313253 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-hosts-file\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313292 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-socket-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313319 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b88ww\" (UniqueName: \"kubernetes.io/projected/4d508526-4ec1-4ecd-be56-0426bc2e4469-kube-api-access-b88ww\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313385 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5172b522-bc83-41f2-8760-e2fba5340ff1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.313731 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4d508526-4ec1-4ecd-be56-0426bc2e4469-socket-dir\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.314080 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84ace2a9-8bcc-47b5-81bb-c764aa280104-env-overrides\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.314122 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c578c137-c4e7-4fd5-8394-2a72b0661d12-etc-tuned\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.314296 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84ace2a9-8bcc-47b5-81bb-c764aa280104-ovn-node-metrics-cert\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.314353 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:17.314794 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.314372 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:17.315313 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.314381 2565 projected.go:194] Error preparing data for projected volume kube-api-access-xpsk8 for pod openshift-network-diagnostics/network-check-target-d7tkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:17.315313 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.314409 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/233701c2-920c-46da-8d99-9ee0fe62c01a-agent-certs\") pod \"konnectivity-agent-ldwhr\" (UID: \"233701c2-920c-46da-8d99-9ee0fe62c01a\") " pod="kube-system/konnectivity-agent-ldwhr" Apr 16 14:52:17.315313 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.314505 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8 podName:82d43552-0266-40be-b011-548c6b1da18a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:17.814492589 +0000 UTC m=+2.155623426 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xpsk8" (UniqueName: "kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8") pod "network-check-target-d7tkp" (UID: "82d43552-0266-40be-b011-548c6b1da18a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:17.315406 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.315337 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c578c137-c4e7-4fd5-8394-2a72b0661d12-tmp\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.316217 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.316200 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cbrc\" (UniqueName: \"kubernetes.io/projected/5bd34617-f0f5-4b74-b464-a7613ad4c7a9-kube-api-access-5cbrc\") pod \"multus-jf55j\" (UID: \"5bd34617-f0f5-4b74-b464-a7613ad4c7a9\") " pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.316313 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.316297 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8p2r\" (UniqueName: \"kubernetes.io/projected/fd96140b-7f7e-4208-9c8d-400e5a881b11-kube-api-access-v8p2r\") pod \"iptables-alerter-5zt6p\" (UID: \"fd96140b-7f7e-4208-9c8d-400e5a881b11\") " pod="openshift-network-operator/iptables-alerter-5zt6p" Apr 16 14:52:17.317726 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.317705 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pcz9\" (UniqueName: \"kubernetes.io/projected/84ace2a9-8bcc-47b5-81bb-c764aa280104-kube-api-access-7pcz9\") pod \"ovnkube-node-t4krl\" (UID: \"84ace2a9-8bcc-47b5-81bb-c764aa280104\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 
14:52:17.318694 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.318677 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4mk\" (UniqueName: \"kubernetes.io/projected/cdb087ca-e5b6-43aa-88b4-f2d25147cf7e-kube-api-access-cr4mk\") pod \"node-resolver-f4596\" (UID: \"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e\") " pod="openshift-dns/node-resolver-f4596" Apr 16 14:52:17.319051 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.319035 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m56t\" (UniqueName: \"kubernetes.io/projected/5172b522-bc83-41f2-8760-e2fba5340ff1-kube-api-access-8m56t\") pod \"multus-additional-cni-plugins-4mtb6\" (UID: \"5172b522-bc83-41f2-8760-e2fba5340ff1\") " pod="openshift-multus/multus-additional-cni-plugins-4mtb6" Apr 16 14:52:17.319098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.319063 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g4cd\" (UniqueName: \"kubernetes.io/projected/c578c137-c4e7-4fd5-8394-2a72b0661d12-kube-api-access-2g4cd\") pod \"tuned-cj56j\" (UID: \"c578c137-c4e7-4fd5-8394-2a72b0661d12\") " pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.322911 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.322887 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfdsn\" (UniqueName: \"kubernetes.io/projected/38d86a56-d8b6-4bb2-a413-3166ca14717f-kube-api-access-cfdsn\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:52:17.323609 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.323589 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6v76\" (UniqueName: \"kubernetes.io/projected/40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5-kube-api-access-s6v76\") pod \"node-ca-fpnf7\" (UID: 
\"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5\") " pod="openshift-image-registry/node-ca-fpnf7" Apr 16 14:52:17.324973 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.324935 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" event={"ID":"540800c0848b372f11baec71c30048e7","Type":"ContainerStarted","Data":"c62b093e667193f435c42fb9e32273e3488fd4be6d964745e5cdbaf65ff0d920"} Apr 16 14:52:17.325550 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.325532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88ww\" (UniqueName: \"kubernetes.io/projected/4d508526-4ec1-4ecd-be56-0426bc2e4469-kube-api-access-b88ww\") pod \"aws-ebs-csi-driver-node-pbdzr\" (UID: \"4d508526-4ec1-4ecd-be56-0426bc2e4469\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" Apr 16 14:52:17.325910 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.325892 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" event={"ID":"4c16a30a2810a56bab96c6b9f99d7cfc","Type":"ContainerStarted","Data":"b2a1c4bc7209fd24f0d1e24492a9f216dc8fff5f34ef5294fe3df90eb8695948"} Apr 16 14:52:17.505046 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.505014 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jf55j" Apr 16 14:52:17.510593 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.510558 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bd34617_f0f5_4b74_b464_a7613ad4c7a9.slice/crio-b31adb592ec1e6df05b595b7155586d3b7b076879fcb52471f03badf542460bc WatchSource:0}: Error finding container b31adb592ec1e6df05b595b7155586d3b7b076879fcb52471f03badf542460bc: Status 404 returned error can't find the container with id b31adb592ec1e6df05b595b7155586d3b7b076879fcb52471f03badf542460bc Apr 16 14:52:17.529559 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.529536 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:52:17.534938 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.534914 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ace2a9_8bcc_47b5_81bb_c764aa280104.slice/crio-126e0fb4969444849114d61407895ae8eb72fb1f612315e7a0071755b8463fdf WatchSource:0}: Error finding container 126e0fb4969444849114d61407895ae8eb72fb1f612315e7a0071755b8463fdf: Status 404 returned error can't find the container with id 126e0fb4969444849114d61407895ae8eb72fb1f612315e7a0071755b8463fdf Apr 16 14:52:17.547892 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.547873 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ldwhr" Apr 16 14:52:17.554672 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.554652 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod233701c2_920c_46da_8d99_9ee0fe62c01a.slice/crio-919249350a85cabbe756536347c710c67200642f4679aae46beedf19c0588068 WatchSource:0}: Error finding container 919249350a85cabbe756536347c710c67200642f4679aae46beedf19c0588068: Status 404 returned error can't find the container with id 919249350a85cabbe756536347c710c67200642f4679aae46beedf19c0588068 Apr 16 14:52:17.565567 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.565546 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" Apr 16 14:52:17.571002 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.570979 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d508526_4ec1_4ecd_be56_0426bc2e4469.slice/crio-ecde8bc17cd73d64c393a6e08baf0a95079e206fe5d6b23c5dd788e12a6dccc1 WatchSource:0}: Error finding container ecde8bc17cd73d64c393a6e08baf0a95079e206fe5d6b23c5dd788e12a6dccc1: Status 404 returned error can't find the container with id ecde8bc17cd73d64c393a6e08baf0a95079e206fe5d6b23c5dd788e12a6dccc1 Apr 16 14:52:17.583112 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.583090 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cj56j" Apr 16 14:52:17.588310 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.588287 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc578c137_c4e7_4fd5_8394_2a72b0661d12.slice/crio-cf43d591a2aec8485dbf93dbd27ad90d4108fb43256b7468cca2c1b321a87872 WatchSource:0}: Error finding container cf43d591a2aec8485dbf93dbd27ad90d4108fb43256b7468cca2c1b321a87872: Status 404 returned error can't find the container with id cf43d591a2aec8485dbf93dbd27ad90d4108fb43256b7468cca2c1b321a87872 Apr 16 14:52:17.594048 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.594026 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5zt6p" Apr 16 14:52:17.599770 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.599751 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f4596" Apr 16 14:52:17.599943 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.599923 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd96140b_7f7e_4208_9c8d_400e5a881b11.slice/crio-ae49844ee073912f72db3fb5dd8200f5b7968b5da13ab287c29bf4b01945ea87 WatchSource:0}: Error finding container ae49844ee073912f72db3fb5dd8200f5b7968b5da13ab287c29bf4b01945ea87: Status 404 returned error can't find the container with id ae49844ee073912f72db3fb5dd8200f5b7968b5da13ab287c29bf4b01945ea87 Apr 16 14:52:17.605477 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.605337 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fpnf7" Apr 16 14:52:17.607692 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.607670 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdb087ca_e5b6_43aa_88b4_f2d25147cf7e.slice/crio-e3f35fb6e3424c9a4be39fa7a1f2a1e9ab8f2dbd81941a0542366b579d66e687 WatchSource:0}: Error finding container e3f35fb6e3424c9a4be39fa7a1f2a1e9ab8f2dbd81941a0542366b579d66e687: Status 404 returned error can't find the container with id e3f35fb6e3424c9a4be39fa7a1f2a1e9ab8f2dbd81941a0542366b579d66e687 Apr 16 14:52:17.612470 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.612450 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f100bd_4ea9_4c4c_bfe7_d00fbe4359f5.slice/crio-597c71404b4f2f312ba2f2bc206bad2dac0c783b13921159b132ff70f424b5fe WatchSource:0}: Error finding container 597c71404b4f2f312ba2f2bc206bad2dac0c783b13921159b132ff70f424b5fe: Status 404 returned error can't find the container with id 597c71404b4f2f312ba2f2bc206bad2dac0c783b13921159b132ff70f424b5fe Apr 16 14:52:17.613785 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.613767 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" Apr 16 14:52:17.619446 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:17.619429 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5172b522_bc83_41f2_8760_e2fba5340ff1.slice/crio-9449c36895f8c142c2f0001225f68cd80e27cbabcf316230073c8b4ec7f452a5 WatchSource:0}: Error finding container 9449c36895f8c142c2f0001225f68cd80e27cbabcf316230073c8b4ec7f452a5: Status 404 returned error can't find the container with id 9449c36895f8c142c2f0001225f68cd80e27cbabcf316230073c8b4ec7f452a5 Apr 16 14:52:17.817827 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.817732 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp" Apr 16 14:52:17.817827 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:17.817802 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:52:17.818049 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.817935 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:17.818049 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.817994 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:18.817974165 +0000 UTC m=+3.159105004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:17.818154 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.818060 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:17.818154 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.818077 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:17.818154 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.818091 2565 projected.go:194] Error preparing data for projected volume kube-api-access-xpsk8 for pod openshift-network-diagnostics/network-check-target-d7tkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:17.818154 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:17.818129 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8 podName:82d43552-0266-40be-b011-548c6b1da18a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:18.818116646 +0000 UTC m=+3.159247479 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xpsk8" (UniqueName: "kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8") pod "network-check-target-d7tkp" (UID: "82d43552-0266-40be-b011-548c6b1da18a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:18.016561 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.016342 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:18.207308 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.207073 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:18.243430 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.243359 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:17 +0000 UTC" deadline="2027-12-03 05:29:52.863514694 +0000 UTC" Apr 16 14:52:18.243430 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.243392 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14294h37m34.620126812s" Apr 16 14:52:18.322783 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.322741 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:52:18.322969 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:18.322893 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f" Apr 16 14:52:18.349111 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.349077 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fpnf7" event={"ID":"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5","Type":"ContainerStarted","Data":"597c71404b4f2f312ba2f2bc206bad2dac0c783b13921159b132ff70f424b5fe"} Apr 16 14:52:18.352002 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.351954 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5zt6p" event={"ID":"fd96140b-7f7e-4208-9c8d-400e5a881b11","Type":"ContainerStarted","Data":"ae49844ee073912f72db3fb5dd8200f5b7968b5da13ab287c29bf4b01945ea87"} Apr 16 14:52:18.360604 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.360534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" event={"ID":"4d508526-4ec1-4ecd-be56-0426bc2e4469","Type":"ContainerStarted","Data":"ecde8bc17cd73d64c393a6e08baf0a95079e206fe5d6b23c5dd788e12a6dccc1"} Apr 16 14:52:18.369216 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.369137 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"126e0fb4969444849114d61407895ae8eb72fb1f612315e7a0071755b8463fdf"} Apr 16 14:52:18.394472 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.394385 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jf55j" event={"ID":"5bd34617-f0f5-4b74-b464-a7613ad4c7a9","Type":"ContainerStarted","Data":"b31adb592ec1e6df05b595b7155586d3b7b076879fcb52471f03badf542460bc"} Apr 16 14:52:18.412226 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.412188 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" 
event={"ID":"5172b522-bc83-41f2-8760-e2fba5340ff1","Type":"ContainerStarted","Data":"9449c36895f8c142c2f0001225f68cd80e27cbabcf316230073c8b4ec7f452a5"} Apr 16 14:52:18.413470 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.413444 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f4596" event={"ID":"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e","Type":"ContainerStarted","Data":"e3f35fb6e3424c9a4be39fa7a1f2a1e9ab8f2dbd81941a0542366b579d66e687"} Apr 16 14:52:18.422789 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.422748 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cj56j" event={"ID":"c578c137-c4e7-4fd5-8394-2a72b0661d12","Type":"ContainerStarted","Data":"cf43d591a2aec8485dbf93dbd27ad90d4108fb43256b7468cca2c1b321a87872"} Apr 16 14:52:18.435033 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.435002 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ldwhr" event={"ID":"233701c2-920c-46da-8d99-9ee0fe62c01a","Type":"ContainerStarted","Data":"919249350a85cabbe756536347c710c67200642f4679aae46beedf19c0588068"} Apr 16 14:52:18.485351 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.485318 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:18.825685 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.825647 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:52:18.825864 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:18.825717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: 
\"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp" Apr 16 14:52:18.825922 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:18.825860 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:18.825922 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:18.825877 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:18.825922 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:18.825890 2565 projected.go:194] Error preparing data for projected volume kube-api-access-xpsk8 for pod openshift-network-diagnostics/network-check-target-d7tkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:18.826084 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:18.825948 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8 podName:82d43552-0266-40be-b011-548c6b1da18a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:20.825929567 +0000 UTC m=+5.167060417 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xpsk8" (UniqueName: "kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8") pod "network-check-target-d7tkp" (UID: "82d43552-0266-40be-b011-548c6b1da18a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:18.826409 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:18.826391 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:18.826499 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:18.826446 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:20.826429689 +0000 UTC m=+5.167560524 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:19.244372 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:19.244274 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:17 +0000 UTC" deadline="2027-11-21 08:57:44.344507467 +0000 UTC" Apr 16 14:52:19.244372 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:19.244314 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14010h5m25.100196886s" Apr 16 14:52:19.323090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:19.322559 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:19.323090 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:19.322675 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:19.879077 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:19.879044 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gc9vp"]
Apr 16 14:52:19.881446 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:19.881425 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:19.881584 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:19.881500 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:19.934101 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:19.934065 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/55745b2f-83f9-46da-95cf-59aa391f6226-kubelet-config\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:19.934394 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:19.934133 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55745b2f-83f9-46da-95cf-59aa391f6226-dbus\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:19.934394 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:19.934234 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:20.034798 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.034754 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:20.034978 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.034827 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/55745b2f-83f9-46da-95cf-59aa391f6226-kubelet-config\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:20.034978 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.034872 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55745b2f-83f9-46da-95cf-59aa391f6226-dbus\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:20.035111 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.035063 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/55745b2f-83f9-46da-95cf-59aa391f6226-dbus\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:20.036016 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.035270 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/55745b2f-83f9-46da-95cf-59aa391f6226-kubelet-config\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:20.036016 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.035342 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:20.036016 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.035405 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret podName:55745b2f-83f9-46da-95cf-59aa391f6226 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:20.535385492 +0000 UTC m=+4.876516332 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret") pod "global-pull-secret-syncer-gc9vp" (UID: "55745b2f-83f9-46da-95cf-59aa391f6226") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:20.325404 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.325371 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:20.325867 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.325516 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:20.539613 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.539575 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:20.539785 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.539719 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:20.539785 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.539782 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret podName:55745b2f-83f9-46da-95cf-59aa391f6226 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:21.53976312 +0000 UTC m=+5.880893956 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret") pod "global-pull-secret-syncer-gc9vp" (UID: "55745b2f-83f9-46da-95cf-59aa391f6226") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:20.843559 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.843517 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:20.843760 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:20.843633 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:20.843835 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.843809 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:20.843886 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.843841 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:20.843886 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.843855 2565 projected.go:194] Error preparing data for projected volume kube-api-access-xpsk8 for pod openshift-network-diagnostics/network-check-target-d7tkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:20.843986 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.843915 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8 podName:82d43552-0266-40be-b011-548c6b1da18a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:24.843895449 +0000 UTC m=+9.185026296 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xpsk8" (UniqueName: "kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8") pod "network-check-target-d7tkp" (UID: "82d43552-0266-40be-b011-548c6b1da18a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:20.844062 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.843996 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:20.844062 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:20.844034 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:24.844022035 +0000 UTC m=+9.185152873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:21.323066 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:21.323028 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:21.323273 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:21.323156 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:21.323361 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:21.323300 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:21.323431 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:21.323394 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:21.552491 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:21.551782 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:21.552491 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:21.552077 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:21.552491 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:21.552143 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret podName:55745b2f-83f9-46da-95cf-59aa391f6226 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:23.55212418 +0000 UTC m=+7.893255017 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret") pod "global-pull-secret-syncer-gc9vp" (UID: "55745b2f-83f9-46da-95cf-59aa391f6226") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:22.323252 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:22.323160 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:22.323416 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:22.323371 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:23.322844 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:23.322812 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:23.323356 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:23.322812 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:23.323356 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:23.322938 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:23.323356 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:23.323018 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:23.569183 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:23.569130 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:23.569367 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:23.569294 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:23.569367 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:23.569366 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret podName:55745b2f-83f9-46da-95cf-59aa391f6226 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:27.569346036 +0000 UTC m=+11.910476886 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret") pod "global-pull-secret-syncer-gc9vp" (UID: "55745b2f-83f9-46da-95cf-59aa391f6226") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:24.322903 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:24.322866 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:24.323370 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:24.323029 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:24.880078 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:24.880025 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:24.880315 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:24.880221 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:24.880315 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:24.880247 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:24.880468 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:24.880328 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.880305506 +0000 UTC m=+17.221436352 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:24.880468 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:24.880356 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:24.880468 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:24.880374 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:24.880468 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:24.880386 2565 projected.go:194] Error preparing data for projected volume kube-api-access-xpsk8 for pod openshift-network-diagnostics/network-check-target-d7tkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:24.880468 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:24.880434 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8 podName:82d43552-0266-40be-b011-548c6b1da18a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.880422049 +0000 UTC m=+17.221552881 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xpsk8" (UniqueName: "kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8") pod "network-check-target-d7tkp" (UID: "82d43552-0266-40be-b011-548c6b1da18a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:25.323150 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:25.323114 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:25.323150 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:25.323136 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:25.323680 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:25.323279 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:25.323680 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:25.323388 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:26.323548 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:26.323462 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:26.323982 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:26.323591 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:27.323039 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:27.322868 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:27.323244 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:27.322868 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:27.323244 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:27.323143 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:27.323244 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:27.323199 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:27.602124 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:27.602032 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:27.602578 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:27.602220 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:27.602578 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:27.602298 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret podName:55745b2f-83f9-46da-95cf-59aa391f6226 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:35.602277169 +0000 UTC m=+19.943408003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret") pod "global-pull-secret-syncer-gc9vp" (UID: "55745b2f-83f9-46da-95cf-59aa391f6226") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:28.322887 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:28.322846 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:28.323087 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:28.322970 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:29.323096 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:29.323067 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:29.323502 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:29.323067 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:29.323502 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:29.323203 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:29.323502 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:29.323299 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:30.323328 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:30.323289 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:30.323728 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:30.323422 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:31.323007 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:31.322923 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:31.323007 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:31.322936 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:31.323250 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:31.323067 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:31.323250 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:31.323154 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:32.323442 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:32.323402 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:32.323911 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:32.323549 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:32.940319 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:32.940279 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:32.940546 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:32.940373 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:32.940546 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:32.940439 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:32.940546 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:32.940506 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.940486937 +0000 UTC m=+33.281617775 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:32.940546 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:32.940518 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:32.940546 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:32.940546 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:32.940795 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:32.940559 2565 projected.go:194] Error preparing data for projected volume kube-api-access-xpsk8 for pod openshift-network-diagnostics/network-check-target-d7tkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:32.940795 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:32.940614 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8 podName:82d43552-0266-40be-b011-548c6b1da18a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.940597865 +0000 UTC m=+33.281728715 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xpsk8" (UniqueName: "kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8") pod "network-check-target-d7tkp" (UID: "82d43552-0266-40be-b011-548c6b1da18a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:33.323367 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:33.323266 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:33.323515 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:33.323266 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:33.323515 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:33.323392 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:33.323515 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:33.323504 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:34.323182 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:34.323134 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:34.323370 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:34.323283 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:35.323254 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:35.323216 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:35.323254 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:35.323266 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:35.323778 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:35.323365 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:35.323778 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:35.323488 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:35.658025 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:35.658003 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:35.658126 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:35.658114 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:35.658185 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:35.658160 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret podName:55745b2f-83f9-46da-95cf-59aa391f6226 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:51.658148989 +0000 UTC m=+35.999279822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret") pod "global-pull-secret-syncer-gc9vp" (UID: "55745b2f-83f9-46da-95cf-59aa391f6226") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:36.323852 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.323639 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:36.324531 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:36.323932 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:36.468565 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.468534 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log"
Apr 16 14:52:36.468892 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.468871 2565 generic.go:358] "Generic (PLEG): container finished" podID="84ace2a9-8bcc-47b5-81bb-c764aa280104" containerID="261f4819dae3eb962b07603e8a9ea227825fa0c7e3084b5a591ebacc631f1257" exitCode=1
Apr 16 14:52:36.468979 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.468955 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"06245dda91c4577481595d3cc55449e4f81da062c6a73781b73d762913ff46db"}
Apr 16 14:52:36.469037 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.468995 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"f78c1bfa6be2a566abe98d809e149ed09e7b80ae4de32f9c23190eb1934fee13"}
Apr 16 14:52:36.469037 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.469011 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"f1eca474f86c7e62c1400c1eb8847972cf158e76a63c4c695f957ef5de131b25"}
Apr 16 14:52:36.469037 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.469023 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"61da5504d28b58830bb59cb644da0322a0e84cc4e8f5b71e01042c94fc8c7888"}
Apr 16 14:52:36.469196 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.469036 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerDied","Data":"261f4819dae3eb962b07603e8a9ea227825fa0c7e3084b5a591ebacc631f1257"}
Apr 16 14:52:36.469196 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.469052 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"5d0a479e9b044f35999ecf31a9f83b98039ac7d8e7b353349ef3cddcb3f04cff"}
Apr 16 14:52:36.470265 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.470241 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jf55j" event={"ID":"5bd34617-f0f5-4b74-b464-a7613ad4c7a9","Type":"ContainerStarted","Data":"24a1bab737ce3bb3ce27feb7c0a1378cb83c581e927aafc3b6eea21ff700a761"}
Apr 16 14:52:36.472129 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.472101 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cj56j" event={"ID":"c578c137-c4e7-4fd5-8394-2a72b0661d12","Type":"ContainerStarted","Data":"e35cd3079fdc5fcd5717d4b35a4ff1c307b369cbd5228b4df1b65c1d30dba088"}
Apr 16 14:52:36.473632 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.473606 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" event={"ID":"540800c0848b372f11baec71c30048e7","Type":"ContainerStarted","Data":"98dd15491f7889d3d71b7ecab413608b7a893bceb6766e2cdb98c6eef202c066"}
Apr 16 14:52:36.486816 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.486757 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jf55j" podStartSLOduration=1.962698611 podStartE2EDuration="20.486734523s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.512009209 +0000 UTC m=+1.853140042" lastFinishedPulling="2026-04-16 14:52:36.036045119 +0000 UTC m=+20.377175954" observedRunningTime="2026-04-16 14:52:36.48628531 +0000 UTC m=+20.827416165" watchObservedRunningTime="2026-04-16 14:52:36.486734523 +0000 UTC m=+20.827865380"
Apr 16 14:52:36.512778 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.512734 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cj56j" podStartSLOduration=2.484666051 podStartE2EDuration="20.512720532s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.589668842 +0000 UTC m=+1.930799678" lastFinishedPulling="2026-04-16 14:52:35.617723314 +0000 UTC m=+19.958854159" observedRunningTime="2026-04-16 14:52:36.512522425 +0000 UTC m=+20.853653280" watchObservedRunningTime="2026-04-16 14:52:36.512720532 +0000 UTC m=+20.853851386"
Apr 16 14:52:36.512955 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:36.512934 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-46.ec2.internal" podStartSLOduration=20.512927993 podStartE2EDuration="20.512927993s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:36.498434554 +0000 UTC m=+20.839565412" watchObservedRunningTime="2026-04-16 14:52:36.512927993 +0000 UTC m=+20.854058848"
Apr 16 14:52:37.322924 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.322729 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:37.323115 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.322729 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:37.323115 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:37.323029 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:37.323115 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:37.323098 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:37.476961 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.476926 2565 generic.go:358] "Generic (PLEG): container finished" podID="4c16a30a2810a56bab96c6b9f99d7cfc" containerID="5eb9a150daa5096b912b86b8958a543579759a586590ca9489be902bd797d138" exitCode=0
Apr 16 14:52:37.477461 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.477004 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" event={"ID":"4c16a30a2810a56bab96c6b9f99d7cfc","Type":"ContainerDied","Data":"5eb9a150daa5096b912b86b8958a543579759a586590ca9489be902bd797d138"}
Apr 16 14:52:37.478470 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.478443 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fpnf7" event={"ID":"40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5","Type":"ContainerStarted","Data":"0c2a146a8cf2d66d5e62fb11211cc68b704e6ad0d002a2805bd9a170d516c0cb"}
Apr 16 14:52:37.480100 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.480073 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5zt6p" event={"ID":"fd96140b-7f7e-4208-9c8d-400e5a881b11","Type":"ContainerStarted","Data":"8d728d9ecd44a715c2379dc33d9ae2524b852170fc6849312b4c8bed1ba3fde8"}
Apr 16 14:52:37.481894 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.481862 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" event={"ID":"4d508526-4ec1-4ecd-be56-0426bc2e4469","Type":"ContainerStarted","Data":"2004280a0eea0a7b82a0bb0a8f7003b484253196228f4e26b10f81d47eb34fc2"}
Apr 16 14:52:37.484014 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.483988 2565 generic.go:358] "Generic (PLEG): container finished" podID="5172b522-bc83-41f2-8760-e2fba5340ff1"
containerID="6d0c17c09fc813a075ad146e8faa2c2671ffd3be1910e8b6f515fef0c5e4ddcd" exitCode=0
Apr 16 14:52:37.484097 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.484070 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" event={"ID":"5172b522-bc83-41f2-8760-e2fba5340ff1","Type":"ContainerDied","Data":"6d0c17c09fc813a075ad146e8faa2c2671ffd3be1910e8b6f515fef0c5e4ddcd"}
Apr 16 14:52:37.485828 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.485804 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f4596" event={"ID":"cdb087ca-e5b6-43aa-88b4-f2d25147cf7e","Type":"ContainerStarted","Data":"063cfd78b3c419bdfe9d9ad1a6111c6bffc57024e398c633fcdd3f40fdd9c392"}
Apr 16 14:52:37.487780 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.487763 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ldwhr" event={"ID":"233701c2-920c-46da-8d99-9ee0fe62c01a","Type":"ContainerStarted","Data":"d058001b5b786cf0a6fe70c0378bd44ed4be7d41ff03eb3210224a3f2dacab4f"}
Apr 16 14:52:37.510643 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.510593 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5zt6p" podStartSLOduration=3.49611205 podStartE2EDuration="21.510575692s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.601600802 +0000 UTC m=+1.942731635" lastFinishedPulling="2026-04-16 14:52:35.61606443 +0000 UTC m=+19.957195277" observedRunningTime="2026-04-16 14:52:37.510117718 +0000 UTC m=+21.851248574" watchObservedRunningTime="2026-04-16 14:52:37.510575692 +0000 UTC m=+21.851706547"
Apr 16 14:52:37.560827 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.560711 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f4596" podStartSLOduration=3.553593865 podStartE2EDuration="21.56069781s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.609058006 +0000 UTC m=+1.950188851" lastFinishedPulling="2026-04-16 14:52:35.61616196 +0000 UTC m=+19.957292796" observedRunningTime="2026-04-16 14:52:37.56068045 +0000 UTC m=+21.901811317" watchObservedRunningTime="2026-04-16 14:52:37.56069781 +0000 UTC m=+21.901828665"
Apr 16 14:52:37.561120 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.561095 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ldwhr" podStartSLOduration=3.500793141 podStartE2EDuration="21.561088327s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.55575326 +0000 UTC m=+1.896884093" lastFinishedPulling="2026-04-16 14:52:35.616048441 +0000 UTC m=+19.957179279" observedRunningTime="2026-04-16 14:52:37.548519605 +0000 UTC m=+21.889650461" watchObservedRunningTime="2026-04-16 14:52:37.561088327 +0000 UTC m=+21.902219182"
Apr 16 14:52:37.567379 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.567360 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 14:52:37.576226 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:37.576141 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fpnf7" podStartSLOduration=3.57384877 podStartE2EDuration="21.576130045s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.613920059 +0000 UTC m=+1.955050891" lastFinishedPulling="2026-04-16 14:52:35.616201331 +0000 UTC m=+19.957332166" observedRunningTime="2026-04-16 14:52:37.575958714 +0000 UTC m=+21.917089570" watchObservedRunningTime="2026-04-16 14:52:37.576130045 +0000 UTC m=+21.917260900"
Apr 16 14:52:38.231763 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.231723 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:38.232538 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.232494 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:38.276199 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.276064 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:37.567374792Z","UUID":"016a9bdd-0cc0-4115-a70f-24d4f7a4b46b","Handler":null,"Name":"","Endpoint":""}
Apr 16 14:52:38.278483 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.278459 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 14:52:38.278483 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.278490 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 14:52:38.322997 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.322654 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:38.322997 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:38.322801 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:38.491373 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.491106 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" event={"ID":"4d508526-4ec1-4ecd-be56-0426bc2e4469","Type":"ContainerStarted","Data":"82d383f3abd3f84bbd3fd7acdfeab347f66e3322dcc9bfbd8706326e20dec36f"}
Apr 16 14:52:38.494302 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.494280 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log"
Apr 16 14:52:38.494906 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.494877 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"f30a89c9e53ce68ee19e2b2ebc4431cebaec64db02d2cc7bcb8b1fe64bae8852"}
Apr 16 14:52:38.496949 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.496923 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" event={"ID":"4c16a30a2810a56bab96c6b9f99d7cfc","Type":"ContainerStarted","Data":"4652ed84bea5996608c76c29733170858d2610e5c4b4a0007363bda1c47913de"}
Apr 16 14:52:38.497752 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.497734 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:38.498378 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.498357 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ldwhr"
Apr 16 14:52:38.510631 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:38.510583 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-46.ec2.internal" podStartSLOduration=22.510568787 podStartE2EDuration="22.510568787s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:38.510187236 +0000 UTC m=+22.851318083" watchObservedRunningTime="2026-04-16 14:52:38.510568787 +0000 UTC m=+22.851699644"
Apr 16 14:52:39.322894 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:39.322825 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:39.322894 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:39.322850 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:39.323194 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:39.322964 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:39.323194 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:39.323097 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:39.501507 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:39.501467 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" event={"ID":"4d508526-4ec1-4ecd-be56-0426bc2e4469","Type":"ContainerStarted","Data":"4f040674c7622eea84071a654a009147ffd7c0bccd5c952678cd174f21aaecbd"}
Apr 16 14:52:39.516804 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:39.516752 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pbdzr" podStartSLOduration=2.6919829870000003 podStartE2EDuration="23.516734246s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.572540169 +0000 UTC m=+1.913671006" lastFinishedPulling="2026-04-16 14:52:38.39729142 +0000 UTC m=+22.738422265" observedRunningTime="2026-04-16 14:52:39.516518907 +0000 UTC m=+23.857649774" watchObservedRunningTime="2026-04-16 14:52:39.516734246 +0000 UTC m=+23.857865101"
Apr 16 14:52:40.323299 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:40.323253 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:40.323487 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:40.323413 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:41.322555 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:41.322523 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:41.323268 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:41.322523 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:41.323268 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:41.322637 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:41.323268 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:41.322690 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:42.323384 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.323221 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:42.323924 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:42.323459 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:42.509825 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.509797 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log"
Apr 16 14:52:42.510140 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.510116 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"cf45e34797baf9fc8659ab80598b469235679a7a4f36fcbf22831faf47e5578e"}
Apr 16 14:52:42.510483 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.510462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:42.510654 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.510639 2565 scope.go:117] "RemoveContainer" containerID="261f4819dae3eb962b07603e8a9ea227825fa0c7e3084b5a591ebacc631f1257"
Apr 16 14:52:42.511839 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.511812 2565 generic.go:358] "Generic (PLEG): container finished" podID="5172b522-bc83-41f2-8760-e2fba5340ff1" containerID="42ea977238176c3ebb67be2de8e7b2bca06750b6da03d732df8fa5fbe497e91d" exitCode=0
Apr 16 14:52:42.511901 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.511864 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" event={"ID":"5172b522-bc83-41f2-8760-e2fba5340ff1","Type":"ContainerDied","Data":"42ea977238176c3ebb67be2de8e7b2bca06750b6da03d732df8fa5fbe497e91d"}
Apr 16 14:52:42.526322 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.526302 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:42.980668 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:42.980634 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:43.322536 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.322380 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:43.322649 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.322399 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:43.322707 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:43.322646 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:43.322773 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:43.322727 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:43.516831 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.516803 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log"
Apr 16 14:52:43.517249 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.517140 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" event={"ID":"84ace2a9-8bcc-47b5-81bb-c764aa280104","Type":"ContainerStarted","Data":"16bde96f34b824cf1f6382b2e8c95f66b6881cffc8d3cc559a8baba4fec17c2c"}
Apr 16 14:52:43.517482 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.517465 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:43.519122 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.519100 2565 generic.go:358] "Generic (PLEG): container finished" podID="5172b522-bc83-41f2-8760-e2fba5340ff1" containerID="a64e51066309a3d840fd6b2dc0cec5423373552a7246007a1dfb81c602bbba52" exitCode=0
Apr 16 14:52:43.519227 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.519136 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" event={"ID":"5172b522-bc83-41f2-8760-e2fba5340ff1","Type":"ContainerDied","Data":"a64e51066309a3d840fd6b2dc0cec5423373552a7246007a1dfb81c602bbba52"}
Apr 16 14:52:43.534549 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.534523 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gc9vp"]
Apr 16 14:52:43.534664 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.534622 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:43.534754 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:43.534721 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:43.534878 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.534860 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl"
Apr 16 14:52:43.538215 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.538191 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j76vn"]
Apr 16 14:52:43.538320 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.538310 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:43.538435 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:43.538417 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:52:43.538798 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.538780 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d7tkp"]
Apr 16 14:52:43.538879 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.538846 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:43.538938 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:43.538917 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:43.543818 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:43.543775 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" podStartSLOduration=9.248299163 podStartE2EDuration="27.543760697s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.536585338 +0000 UTC m=+1.877716170" lastFinishedPulling="2026-04-16 14:52:35.83204687 +0000 UTC m=+20.173177704" observedRunningTime="2026-04-16 14:52:43.542708509 +0000 UTC m=+27.883839484" watchObservedRunningTime="2026-04-16 14:52:43.543760697 +0000 UTC m=+27.884891554"
Apr 16 14:52:44.523248 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:44.523207 2565 generic.go:358] "Generic (PLEG): container finished" podID="5172b522-bc83-41f2-8760-e2fba5340ff1" containerID="8ff51e4dfc5980f2400ca84ca4914f0e19b2c8d1473f12dccf73509a1407e2bf" exitCode=0
Apr 16 14:52:44.523698 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:44.523294 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" event={"ID":"5172b522-bc83-41f2-8760-e2fba5340ff1","Type":"ContainerDied","Data":"8ff51e4dfc5980f2400ca84ca4914f0e19b2c8d1473f12dccf73509a1407e2bf"}
Apr 16 14:52:45.322921 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:45.322843 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp"
Apr 16 14:52:45.323091 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:45.322843 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:52:45.323091 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:45.322972 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226"
Apr 16 14:52:45.323091 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:45.322842 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:52:45.323091 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:45.323045 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a"
Apr 16 14:52:45.323360 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:45.323135 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f" Apr 16 14:52:47.322770 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:47.322736 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp" Apr 16 14:52:47.323463 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:47.322736 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:52:47.323463 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:47.322849 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d7tkp" podUID="82d43552-0266-40be-b011-548c6b1da18a" Apr 16 14:52:47.323463 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:47.322939 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f" Apr 16 14:52:47.323463 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:47.322736 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp" Apr 16 14:52:47.323463 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:47.323016 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gc9vp" podUID="55745b2f-83f9-46da-95cf-59aa391f6226" Apr 16 14:52:48.935445 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:48.935413 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-46.ec2.internal" event="NodeReady" Apr 16 14:52:48.935840 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:48.935573 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:52:48.957442 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:48.957408 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp" Apr 16 14:52:48.957609 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:48.957550 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:48.957609 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:48.957560 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " 
pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:52:48.957609 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:48.957575 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:48.957609 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:48.957585 2565 projected.go:194] Error preparing data for projected volume kube-api-access-xpsk8 for pod openshift-network-diagnostics/network-check-target-d7tkp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:48.957820 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:48.957635 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8 podName:82d43552-0266-40be-b011-548c6b1da18a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:20.95761856 +0000 UTC m=+65.298749418 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xpsk8" (UniqueName: "kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8") pod "network-check-target-d7tkp" (UID: "82d43552-0266-40be-b011-548c6b1da18a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:48.957820 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:48.957685 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:48.957820 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:48.957737 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. No retries permitted until 2026-04-16 14:53:20.957719985 +0000 UTC m=+65.298850825 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:48.978469 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:48.978441 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4qbwg"] Apr 16 14:52:49.008196 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.008155 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m9hrv"] Apr 16 14:52:49.008428 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.008310 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:49.010931 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.010908 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:52:49.011064 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.010997 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:52:49.011064 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.011035 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gcjjc\"" Apr 16 14:52:49.011064 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.011038 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:52:49.022502 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.022479 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4qbwg"] Apr 16 14:52:49.022647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.022509 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m9hrv"] Apr 16 14:52:49.022647 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.022639 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.025134 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.025026 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:52:49.025134 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.025048 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-njff8\"" Apr 16 14:52:49.025134 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.025099 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:52:49.158800 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.158759 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.158800 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.158803 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:49.159037 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.158840 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/927790a5-7672-4d93-a725-5924ae587d09-tmp-dir\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.159037 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.158897 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt2zw\" (UniqueName: \"kubernetes.io/projected/9a14533b-b916-4308-8775-7107db9fe6de-kube-api-access-wt2zw\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:49.159037 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.158925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/927790a5-7672-4d93-a725-5924ae587d09-config-volume\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.159037 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.158950 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vnsx\" (UniqueName: \"kubernetes.io/projected/927790a5-7672-4d93-a725-5924ae587d09-kube-api-access-2vnsx\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.260322 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.260225 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/927790a5-7672-4d93-a725-5924ae587d09-tmp-dir\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.260494 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.260357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wt2zw\" (UniqueName: \"kubernetes.io/projected/9a14533b-b916-4308-8775-7107db9fe6de-kube-api-access-wt2zw\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:49.260494 ip-10-0-142-46 
kubenswrapper[2565]: I0416 14:52:49.260380 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/927790a5-7672-4d93-a725-5924ae587d09-config-volume\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.260494 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.260397 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vnsx\" (UniqueName: \"kubernetes.io/projected/927790a5-7672-4d93-a725-5924ae587d09-kube-api-access-2vnsx\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.260494 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.260450 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.260494 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.260474 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:49.260720 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:49.260572 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:49.260720 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:49.260647 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert podName:9a14533b-b916-4308-8775-7107db9fe6de nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:49.760627566 +0000 UTC m=+34.101758399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert") pod "ingress-canary-4qbwg" (UID: "9a14533b-b916-4308-8775-7107db9fe6de") : secret "canary-serving-cert" not found Apr 16 14:52:49.260720 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.260646 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/927790a5-7672-4d93-a725-5924ae587d09-tmp-dir\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.260720 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:49.260697 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:49.260935 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:49.260760 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls podName:927790a5-7672-4d93-a725-5924ae587d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:49.76074412 +0000 UTC m=+34.101874956 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls") pod "dns-default-m9hrv" (UID: "927790a5-7672-4d93-a725-5924ae587d09") : secret "dns-default-metrics-tls" not found Apr 16 14:52:49.261099 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.261078 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/927790a5-7672-4d93-a725-5924ae587d09-config-volume\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.271412 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.271247 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vnsx\" (UniqueName: \"kubernetes.io/projected/927790a5-7672-4d93-a725-5924ae587d09-kube-api-access-2vnsx\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.271569 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.271318 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt2zw\" (UniqueName: \"kubernetes.io/projected/9a14533b-b916-4308-8775-7107db9fe6de-kube-api-access-wt2zw\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:49.322543 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.322501 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp" Apr 16 14:52:49.322717 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.322616 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:52:49.322781 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.322738 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp" Apr 16 14:52:49.325161 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.324979 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:52:49.325161 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.325076 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hst6r\"" Apr 16 14:52:49.325161 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.325088 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dpmcp\"" Apr 16 14:52:49.325161 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.325106 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:52:49.325770 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.325738 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:52:49.325871 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.325776 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:52:49.763755 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:49.763724 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:49.763755 ip-10-0-142-46 
kubenswrapper[2565]: I0416 14:52:49.763757 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:49.764100 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:49.763877 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:49.764100 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:49.763884 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:49.764100 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:49.763949 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert podName:9a14533b-b916-4308-8775-7107db9fe6de nodeName:}" failed. No retries permitted until 2026-04-16 14:52:50.763933835 +0000 UTC m=+35.105064668 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert") pod "ingress-canary-4qbwg" (UID: "9a14533b-b916-4308-8775-7107db9fe6de") : secret "canary-serving-cert" not found Apr 16 14:52:49.764100 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:49.763963 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls podName:927790a5-7672-4d93-a725-5924ae587d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:50.763957526 +0000 UTC m=+35.105088359 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls") pod "dns-default-m9hrv" (UID: "927790a5-7672-4d93-a725-5924ae587d09") : secret "dns-default-metrics-tls" not found Apr 16 14:52:50.770411 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:50.770370 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:50.770867 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:50.770422 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:50.770867 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:50.770512 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:50.770867 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:50.770568 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:50.770867 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:50.770587 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls podName:927790a5-7672-4d93-a725-5924ae587d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:52.770560185 +0000 UTC m=+37.111691018 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls") pod "dns-default-m9hrv" (UID: "927790a5-7672-4d93-a725-5924ae587d09") : secret "dns-default-metrics-tls" not found Apr 16 14:52:50.770867 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:50.770629 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert podName:9a14533b-b916-4308-8775-7107db9fe6de nodeName:}" failed. No retries permitted until 2026-04-16 14:52:52.770610552 +0000 UTC m=+37.111741385 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert") pod "ingress-canary-4qbwg" (UID: "9a14533b-b916-4308-8775-7107db9fe6de") : secret "canary-serving-cert" not found Apr 16 14:52:51.538905 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:51.538867 2565 generic.go:358] "Generic (PLEG): container finished" podID="5172b522-bc83-41f2-8760-e2fba5340ff1" containerID="b7e4ddc07b122f3675a3c01c3803ec0d5a1ee61d07ebc5f57a489fe9b543e9af" exitCode=0 Apr 16 14:52:51.538905 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:51.538914 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" event={"ID":"5172b522-bc83-41f2-8760-e2fba5340ff1","Type":"ContainerDied","Data":"b7e4ddc07b122f3675a3c01c3803ec0d5a1ee61d07ebc5f57a489fe9b543e9af"} Apr 16 14:52:51.679113 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:51.679090 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp" Apr 16 14:52:51.681977 ip-10-0-142-46 kubenswrapper[2565]: I0416 
14:52:51.681957 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/55745b2f-83f9-46da-95cf-59aa391f6226-original-pull-secret\") pod \"global-pull-secret-syncer-gc9vp\" (UID: \"55745b2f-83f9-46da-95cf-59aa391f6226\") " pod="kube-system/global-pull-secret-syncer-gc9vp" Apr 16 14:52:51.746836 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:51.746802 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gc9vp" Apr 16 14:52:51.898369 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:51.898340 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gc9vp"] Apr 16 14:52:51.906127 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:51.906101 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55745b2f_83f9_46da_95cf_59aa391f6226.slice/crio-40bbcfbce27ed474fcffe6684c0cfdf7f4ddfe0e33595b4ed8c3924d83f694ed WatchSource:0}: Error finding container 40bbcfbce27ed474fcffe6684c0cfdf7f4ddfe0e33595b4ed8c3924d83f694ed: Status 404 returned error can't find the container with id 40bbcfbce27ed474fcffe6684c0cfdf7f4ddfe0e33595b4ed8c3924d83f694ed Apr 16 14:52:52.542905 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:52.542660 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gc9vp" event={"ID":"55745b2f-83f9-46da-95cf-59aa391f6226","Type":"ContainerStarted","Data":"40bbcfbce27ed474fcffe6684c0cfdf7f4ddfe0e33595b4ed8c3924d83f694ed"} Apr 16 14:52:52.546241 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:52.546211 2565 generic.go:358] "Generic (PLEG): container finished" podID="5172b522-bc83-41f2-8760-e2fba5340ff1" containerID="6a743b3949fb4906916c1e9993de7dfd39d6ff3445674e4ab04c463b060bc910" exitCode=0 Apr 16 14:52:52.546397 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:52.546286 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" event={"ID":"5172b522-bc83-41f2-8760-e2fba5340ff1","Type":"ContainerDied","Data":"6a743b3949fb4906916c1e9993de7dfd39d6ff3445674e4ab04c463b060bc910"} Apr 16 14:52:52.787277 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:52.787237 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:52.787277 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:52.787283 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:52.787518 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:52.787385 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:52.787518 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:52.787393 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:52.787518 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:52.787440 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert podName:9a14533b-b916-4308-8775-7107db9fe6de nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.787420583 +0000 UTC m=+41.128551433 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert") pod "ingress-canary-4qbwg" (UID: "9a14533b-b916-4308-8775-7107db9fe6de") : secret "canary-serving-cert" not found Apr 16 14:52:52.787666 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:52.787530 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls podName:927790a5-7672-4d93-a725-5924ae587d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.787522414 +0000 UTC m=+41.128653247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls") pod "dns-default-m9hrv" (UID: "927790a5-7672-4d93-a725-5924ae587d09") : secret "dns-default-metrics-tls" not found Apr 16 14:52:53.551349 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.551317 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" event={"ID":"5172b522-bc83-41f2-8760-e2fba5340ff1","Type":"ContainerStarted","Data":"51984cf02f7ce9946e6fbe8f6c189246c6cd5a33cbea7902612bba5d1d5baac0"} Apr 16 14:52:53.574550 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.574478 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4mtb6" podStartSLOduration=4.792414855 podStartE2EDuration="37.574458656s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:17.620766254 +0000 UTC m=+1.961897087" lastFinishedPulling="2026-04-16 14:52:50.402810055 +0000 UTC m=+34.743940888" observedRunningTime="2026-04-16 14:52:53.574003433 +0000 UTC m=+37.915134316" watchObservedRunningTime="2026-04-16 14:52:53.574458656 +0000 UTC m=+37.915589515" Apr 16 14:52:53.944827 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.944759 2565 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95"] Apr 16 14:52:53.965865 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.965836 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95"] Apr 16 14:52:53.966039 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.965993 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" Apr 16 14:52:53.969443 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.969416 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 14:52:53.969584 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.969422 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 14:52:53.969875 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.969856 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 14:52:53.970272 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.970254 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-vbfv9\"" Apr 16 14:52:53.970481 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.970464 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 14:52:53.976391 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:53.976371 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb"] Apr 16 14:52:54.012624 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.000955 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb"] Apr 16 14:52:54.012624 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.001101 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.013047 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.013022 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 14:52:54.013564 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.013343 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 14:52:54.013564 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.013416 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 14:52:54.013771 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.013730 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 14:52:54.115036 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.114997 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-ca\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 
14:52:54.115207 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.115050 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngtv\" (UniqueName: \"kubernetes.io/projected/f23c413d-24a7-4f72-9a36-fdce46324970-kube-api-access-cngtv\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.115207 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.115077 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d325b53-ff81-45c5-9544-c9d753efe187-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95\" (UID: \"4d325b53-ff81-45c5-9544-c9d753efe187\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" Apr 16 14:52:54.115207 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.115095 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.115207 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.115113 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-hub\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.115207 ip-10-0-142-46 kubenswrapper[2565]: I0416 
14:52:54.115135 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f23c413d-24a7-4f72-9a36-fdce46324970-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.115388 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.115229 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.115388 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.115263 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr5bc\" (UniqueName: \"kubernetes.io/projected/4d325b53-ff81-45c5-9544-c9d753efe187-kube-api-access-rr5bc\") pod \"managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95\" (UID: \"4d325b53-ff81-45c5-9544-c9d753efe187\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" Apr 16 14:52:54.216324 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.216220 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.216324 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.216298 2565 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rr5bc\" (UniqueName: \"kubernetes.io/projected/4d325b53-ff81-45c5-9544-c9d753efe187-kube-api-access-rr5bc\") pod \"managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95\" (UID: \"4d325b53-ff81-45c5-9544-c9d753efe187\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" Apr 16 14:52:54.216567 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.216339 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-ca\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.216567 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.216384 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cngtv\" (UniqueName: \"kubernetes.io/projected/f23c413d-24a7-4f72-9a36-fdce46324970-kube-api-access-cngtv\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.216567 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.216422 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d325b53-ff81-45c5-9544-c9d753efe187-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95\" (UID: \"4d325b53-ff81-45c5-9544-c9d753efe187\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" Apr 16 14:52:54.216567 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.216515 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.216567 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.216548 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-hub\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.216815 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.216606 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f23c413d-24a7-4f72-9a36-fdce46324970-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.217418 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.217393 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f23c413d-24a7-4f72-9a36-fdce46324970-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.220779 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.220753 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.220895 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.220820 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d325b53-ff81-45c5-9544-c9d753efe187-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95\" (UID: \"4d325b53-ff81-45c5-9544-c9d753efe187\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" Apr 16 14:52:54.221260 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.221243 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-ca\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.221388 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.221368 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-hub\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.221428 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.221370 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f23c413d-24a7-4f72-9a36-fdce46324970-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.225099 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.225080 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rr5bc\" (UniqueName: \"kubernetes.io/projected/4d325b53-ff81-45c5-9544-c9d753efe187-kube-api-access-rr5bc\") pod \"managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95\" (UID: \"4d325b53-ff81-45c5-9544-c9d753efe187\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" Apr 16 14:52:54.225568 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.225550 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngtv\" (UniqueName: \"kubernetes.io/projected/f23c413d-24a7-4f72-9a36-fdce46324970-kube-api-access-cngtv\") pod \"cluster-proxy-proxy-agent-b788f87b4-5xvqb\" (UID: \"f23c413d-24a7-4f72-9a36-fdce46324970\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:54.286632 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.286593 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" Apr 16 14:52:54.320644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:54.320602 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:52:55.625888 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:55.625858 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb"] Apr 16 14:52:55.629146 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:55.629117 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf23c413d_24a7_4f72_9a36_fdce46324970.slice/crio-44331ebeba8f01a0225420949739116ee5b97636d438c3499c61f6a9f9cda230 WatchSource:0}: Error finding container 44331ebeba8f01a0225420949739116ee5b97636d438c3499c61f6a9f9cda230: Status 404 returned error can't find the container with id 44331ebeba8f01a0225420949739116ee5b97636d438c3499c61f6a9f9cda230 Apr 16 14:52:55.637994 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:55.637971 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95"] Apr 16 14:52:55.641440 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:52:55.641413 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d325b53_ff81_45c5_9544_c9d753efe187.slice/crio-dad0580afd60f9b9e97309b0badba5536a3c3f519b916092bf0f05276c5b8f72 WatchSource:0}: Error finding container dad0580afd60f9b9e97309b0badba5536a3c3f519b916092bf0f05276c5b8f72: Status 404 returned error can't find the container with id dad0580afd60f9b9e97309b0badba5536a3c3f519b916092bf0f05276c5b8f72 Apr 16 14:52:56.560019 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:56.559978 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gc9vp" 
event={"ID":"55745b2f-83f9-46da-95cf-59aa391f6226","Type":"ContainerStarted","Data":"00a5f6ec073ca753ab5a5a2de9f5adde5c238bd3f8cb5d8d8804258b2db17584"} Apr 16 14:52:56.562422 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:56.562385 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" event={"ID":"4d325b53-ff81-45c5-9544-c9d753efe187","Type":"ContainerStarted","Data":"dad0580afd60f9b9e97309b0badba5536a3c3f519b916092bf0f05276c5b8f72"} Apr 16 14:52:56.564060 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:56.564031 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" event={"ID":"f23c413d-24a7-4f72-9a36-fdce46324970","Type":"ContainerStarted","Data":"44331ebeba8f01a0225420949739116ee5b97636d438c3499c61f6a9f9cda230"} Apr 16 14:52:56.836042 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:56.835963 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:52:56.836042 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:52:56.836013 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:52:56.836509 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:56.836113 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:56.836509 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:56.836124 2565 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:56.836509 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:56.836209 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls podName:927790a5-7672-4d93-a725-5924ae587d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.836189644 +0000 UTC m=+49.177320494 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls") pod "dns-default-m9hrv" (UID: "927790a5-7672-4d93-a725-5924ae587d09") : secret "dns-default-metrics-tls" not found Apr 16 14:52:56.836509 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:52:56.836233 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert podName:9a14533b-b916-4308-8775-7107db9fe6de nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.836215768 +0000 UTC m=+49.177346604 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert") pod "ingress-canary-4qbwg" (UID: "9a14533b-b916-4308-8775-7107db9fe6de") : secret "canary-serving-cert" not found Apr 16 14:53:00.573812 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:00.573774 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" event={"ID":"4d325b53-ff81-45c5-9544-c9d753efe187","Type":"ContainerStarted","Data":"b952149ed97148c0014b58cecda31e74527627952d20b20eecb76d337ec0f615"} Apr 16 14:53:00.575025 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:00.575002 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" event={"ID":"f23c413d-24a7-4f72-9a36-fdce46324970","Type":"ContainerStarted","Data":"65e755fd6477705498af0259f94bf6ba37e163fa74978041b21889bae3793edd"} Apr 16 14:53:00.591706 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:00.591652 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gc9vp" podStartSLOduration=37.630610776 podStartE2EDuration="41.591640268s" podCreationTimestamp="2026-04-16 14:52:19 +0000 UTC" firstStartedPulling="2026-04-16 14:52:51.911604715 +0000 UTC m=+36.252735548" lastFinishedPulling="2026-04-16 14:52:55.872634206 +0000 UTC m=+40.213765040" observedRunningTime="2026-04-16 14:52:56.57421308 +0000 UTC m=+40.915343952" watchObservedRunningTime="2026-04-16 14:53:00.591640268 +0000 UTC m=+44.932771124" Apr 16 14:53:00.592197 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:00.592041 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" podStartSLOduration=3.717956845 podStartE2EDuration="7.592030897s" podCreationTimestamp="2026-04-16 14:52:53 
+0000 UTC" firstStartedPulling="2026-04-16 14:52:55.643318502 +0000 UTC m=+39.984449336" lastFinishedPulling="2026-04-16 14:52:59.517392344 +0000 UTC m=+43.858523388" observedRunningTime="2026-04-16 14:53:00.591324077 +0000 UTC m=+44.932454932" watchObservedRunningTime="2026-04-16 14:53:00.592030897 +0000 UTC m=+44.933161753" Apr 16 14:53:03.584120 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:03.584035 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" event={"ID":"f23c413d-24a7-4f72-9a36-fdce46324970","Type":"ContainerStarted","Data":"bf4e2dd9d6d92493bcb5f4b73617662272bd60ecd8abd33e0b7d58bb2f56569a"} Apr 16 14:53:03.584120 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:03.584074 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" event={"ID":"f23c413d-24a7-4f72-9a36-fdce46324970","Type":"ContainerStarted","Data":"5f451581fe801517696b9e9e3e07e40a3a9b1b70c49fbaf8ab8a1ed33272e68e"} Apr 16 14:53:03.604375 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:03.603539 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" podStartSLOduration=2.9292300730000003 podStartE2EDuration="10.603519876s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.631505766 +0000 UTC m=+39.972636601" lastFinishedPulling="2026-04-16 14:53:03.30579556 +0000 UTC m=+47.646926404" observedRunningTime="2026-04-16 14:53:03.603138539 +0000 UTC m=+47.944269394" watchObservedRunningTime="2026-04-16 14:53:03.603519876 +0000 UTC m=+47.944650792" Apr 16 14:53:04.894765 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:04.894724 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:53:04.894765 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:04.894765 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:53:04.895202 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:04.894867 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:04.895202 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:04.894872 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:04.895202 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:04.894917 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert podName:9a14533b-b916-4308-8775-7107db9fe6de nodeName:}" failed. No retries permitted until 2026-04-16 14:53:20.894903318 +0000 UTC m=+65.236034152 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert") pod "ingress-canary-4qbwg" (UID: "9a14533b-b916-4308-8775-7107db9fe6de") : secret "canary-serving-cert" not found Apr 16 14:53:04.895202 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:04.894940 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls podName:927790a5-7672-4d93-a725-5924ae587d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:20.894925794 +0000 UTC m=+65.236056628 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls") pod "dns-default-m9hrv" (UID: "927790a5-7672-4d93-a725-5924ae587d09") : secret "dns-default-metrics-tls" not found Apr 16 14:53:15.536315 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:15.536285 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4krl" Apr 16 14:53:20.907189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:20.907132 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv" Apr 16 14:53:20.907189 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:20.907194 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg" Apr 16 14:53:20.907684 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:20.907293 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:20.907684 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:20.907311 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:20.907684 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:20.907365 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls podName:927790a5-7672-4d93-a725-5924ae587d09 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:52.907349995 +0000 UTC m=+97.248480831 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls") pod "dns-default-m9hrv" (UID: "927790a5-7672-4d93-a725-5924ae587d09") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:20.907684 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:20.907381 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert podName:9a14533b-b916-4308-8775-7107db9fe6de nodeName:}" failed. No retries permitted until 2026-04-16 14:53:52.907376228 +0000 UTC m=+97.248507060 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert") pod "ingress-canary-4qbwg" (UID: "9a14533b-b916-4308-8775-7107db9fe6de") : secret "canary-serving-cert" not found
Apr 16 14:53:21.007938 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.007895 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:53:21.008138 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.007982 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:53:21.010149 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.010133 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:53:21.010279 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.010262 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:53:21.018978 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:21.018955 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:53:21.019078 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:21.019045 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. No retries permitted until 2026-04-16 14:54:25.019025035 +0000 UTC m=+129.360155867 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : secret "metrics-daemon-secret" not found
Apr 16 14:53:21.021326 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.021309 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:53:21.031915 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.031891 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpsk8\" (UniqueName: \"kubernetes.io/projected/82d43552-0266-40be-b011-548c6b1da18a-kube-api-access-xpsk8\") pod \"network-check-target-d7tkp\" (UID: \"82d43552-0266-40be-b011-548c6b1da18a\") " pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:53:21.136324 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.136289 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dpmcp\""
Apr 16 14:53:21.145090 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.145066 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:53:21.251188 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.251114 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d7tkp"]
Apr 16 14:53:21.254872 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:53:21.254844 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d43552_0266_40be_b011_548c6b1da18a.slice/crio-3c00fd9bf6fc4afc2ae49d074ab46dfcb993149946a037a75526e864d4f2f3f5 WatchSource:0}: Error finding container 3c00fd9bf6fc4afc2ae49d074ab46dfcb993149946a037a75526e864d4f2f3f5: Status 404 returned error can't find the container with id 3c00fd9bf6fc4afc2ae49d074ab46dfcb993149946a037a75526e864d4f2f3f5
Apr 16 14:53:21.619186 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:21.619079 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d7tkp" event={"ID":"82d43552-0266-40be-b011-548c6b1da18a","Type":"ContainerStarted","Data":"3c00fd9bf6fc4afc2ae49d074ab46dfcb993149946a037a75526e864d4f2f3f5"}
Apr 16 14:53:24.628005 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:24.627971 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d7tkp" event={"ID":"82d43552-0266-40be-b011-548c6b1da18a","Type":"ContainerStarted","Data":"d1ff621b2f8607c2034599b70325a200c9e0a084fcf1131f02246c63eef9492f"}
Apr 16 14:53:24.628398 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:24.628221 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:53:24.642645 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:24.642603 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-d7tkp" podStartSLOduration=65.857411839 podStartE2EDuration="1m8.642590649s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:53:21.257211324 +0000 UTC m=+65.598342159" lastFinishedPulling="2026-04-16 14:53:24.042390134 +0000 UTC m=+68.383520969" observedRunningTime="2026-04-16 14:53:24.641916273 +0000 UTC m=+68.983047126" watchObservedRunningTime="2026-04-16 14:53:24.642590649 +0000 UTC m=+68.983721502"
Apr 16 14:53:52.931139 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:52.930988 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv"
Apr 16 14:53:52.931139 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:52.931044 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg"
Apr 16 14:53:52.931813 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:52.931152 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:52.931813 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:52.931245 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert podName:9a14533b-b916-4308-8775-7107db9fe6de nodeName:}" failed. No retries permitted until 2026-04-16 14:54:56.931224243 +0000 UTC m=+161.272355075 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert") pod "ingress-canary-4qbwg" (UID: "9a14533b-b916-4308-8775-7107db9fe6de") : secret "canary-serving-cert" not found
Apr 16 14:53:52.931813 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:52.931152 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:52.931813 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:53:52.931315 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls podName:927790a5-7672-4d93-a725-5924ae587d09 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:56.931301204 +0000 UTC m=+161.272432037 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls") pod "dns-default-m9hrv" (UID: "927790a5-7672-4d93-a725-5924ae587d09") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:55.633496 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:53:55.633468 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-d7tkp"
Apr 16 14:54:25.055164 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:25.055115 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:54:25.055701 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:54:25.055285 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:54:25.055701 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:54:25.055352 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs podName:38d86a56-d8b6-4bb2-a413-3166ca14717f nodeName:}" failed. No retries permitted until 2026-04-16 14:56:27.055334428 +0000 UTC m=+251.396465281 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs") pod "network-metrics-daemon-j76vn" (UID: "38d86a56-d8b6-4bb2-a413-3166ca14717f") : secret "metrics-daemon-secret" not found
Apr 16 14:54:37.790291 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:37.790262 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f4596_cdb087ca-e5b6-43aa-88b4-f2d25147cf7e/dns-node-resolver/0.log"
Apr 16 14:54:38.789619 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:38.789586 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fpnf7_40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5/node-ca/0.log"
Apr 16 14:54:52.019881 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:54:52.019831 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4qbwg" podUID="9a14533b-b916-4308-8775-7107db9fe6de"
Apr 16 14:54:52.034952 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:54:52.034929 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-m9hrv" podUID="927790a5-7672-4d93-a725-5924ae587d09"
Apr 16 14:54:52.341142 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:54:52.341052 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-j76vn" podUID="38d86a56-d8b6-4bb2-a413-3166ca14717f"
Apr 16 14:54:52.828316 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:52.828286 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4qbwg"
Apr 16 14:54:56.982303 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:56.982262 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv"
Apr 16 14:54:56.982303 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:56.982309 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg"
Apr 16 14:54:56.984576 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:56.984549 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/927790a5-7672-4d93-a725-5924ae587d09-metrics-tls\") pod \"dns-default-m9hrv\" (UID: \"927790a5-7672-4d93-a725-5924ae587d09\") " pod="openshift-dns/dns-default-m9hrv"
Apr 16 14:54:56.984576 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:56.984572 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a14533b-b916-4308-8775-7107db9fe6de-cert\") pod \"ingress-canary-4qbwg\" (UID: \"9a14533b-b916-4308-8775-7107db9fe6de\") " pod="openshift-ingress-canary/ingress-canary-4qbwg"
Apr 16 14:54:57.031549 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:57.031522 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gcjjc\""
Apr 16 14:54:57.040221 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:57.040192 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4qbwg"
Apr 16 14:54:57.149824 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:57.149789 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4qbwg"]
Apr 16 14:54:57.152743 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:54:57.152710 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a14533b_b916_4308_8775_7107db9fe6de.slice/crio-61adef14733826c55e273d0e5a2c47ceda3adb62348073d6d7d43bc012fad95a WatchSource:0}: Error finding container 61adef14733826c55e273d0e5a2c47ceda3adb62348073d6d7d43bc012fad95a: Status 404 returned error can't find the container with id 61adef14733826c55e273d0e5a2c47ceda3adb62348073d6d7d43bc012fad95a
Apr 16 14:54:57.839952 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:57.839907 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4qbwg" event={"ID":"9a14533b-b916-4308-8775-7107db9fe6de","Type":"ContainerStarted","Data":"61adef14733826c55e273d0e5a2c47ceda3adb62348073d6d7d43bc012fad95a"}
Apr 16 14:54:58.843699 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:58.843660 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4qbwg" event={"ID":"9a14533b-b916-4308-8775-7107db9fe6de","Type":"ContainerStarted","Data":"c6a9d2a4c967b1cc9152d445b51be4c4d594ffcd3c91065a84cb4e10b0ac896c"}
Apr 16 14:54:58.856996 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:58.856787 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4qbwg" podStartSLOduration=129.23030623 podStartE2EDuration="2m10.856769212s" podCreationTimestamp="2026-04-16 14:52:48 +0000 UTC" firstStartedPulling="2026-04-16 14:54:57.15452218 +0000 UTC m=+161.495653029" lastFinishedPulling="2026-04-16 14:54:58.780985167 +0000 UTC m=+163.122116011" observedRunningTime="2026-04-16 14:54:58.856363099 +0000 UTC m=+163.197493967" watchObservedRunningTime="2026-04-16 14:54:58.856769212 +0000 UTC m=+163.197900066"
Apr 16 14:54:59.846752 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:59.846718 2565 generic.go:358] "Generic (PLEG): container finished" podID="4d325b53-ff81-45c5-9544-c9d753efe187" containerID="b952149ed97148c0014b58cecda31e74527627952d20b20eecb76d337ec0f615" exitCode=255
Apr 16 14:54:59.847238 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:59.846795 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" event={"ID":"4d325b53-ff81-45c5-9544-c9d753efe187","Type":"ContainerDied","Data":"b952149ed97148c0014b58cecda31e74527627952d20b20eecb76d337ec0f615"}
Apr 16 14:54:59.847238 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:54:59.847184 2565 scope.go:117] "RemoveContainer" containerID="b952149ed97148c0014b58cecda31e74527627952d20b20eecb76d337ec0f615"
Apr 16 14:55:00.850601 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:00.850564 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fd5c8b56f-d4x95" event={"ID":"4d325b53-ff81-45c5-9544-c9d753efe187","Type":"ContainerStarted","Data":"fb4994c75e99418f073dd9b5808074672920859520d0bcd6b1a39601087ec5f4"}
Apr 16 14:55:03.317942 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.317910 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-chpgs"]
Apr 16 14:55:03.320827 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.320807 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.323564 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.323531 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-clwz5\""
Apr 16 14:55:03.323685 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.323605 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 14:55:03.324303 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.324288 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 14:55:03.324619 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.324601 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:55:03.324691 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.324607 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:55:03.337991 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.337966 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-chpgs"]
Apr 16 14:55:03.425116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.425077 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c771f50-4b0e-4280-b47b-81da44e68d3d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.425116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.425121 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c771f50-4b0e-4280-b47b-81da44e68d3d-crio-socket\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.425409 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.425216 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt2gm\" (UniqueName: \"kubernetes.io/projected/7c771f50-4b0e-4280-b47b-81da44e68d3d-kube-api-access-vt2gm\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.425409 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.425259 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c771f50-4b0e-4280-b47b-81da44e68d3d-data-volume\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.425409 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.425300 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c771f50-4b0e-4280-b47b-81da44e68d3d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.526272 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.526227 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c771f50-4b0e-4280-b47b-81da44e68d3d-crio-socket\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.526272 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.526277 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt2gm\" (UniqueName: \"kubernetes.io/projected/7c771f50-4b0e-4280-b47b-81da44e68d3d-kube-api-access-vt2gm\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.526512 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.526298 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c771f50-4b0e-4280-b47b-81da44e68d3d-data-volume\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.526512 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.526326 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c771f50-4b0e-4280-b47b-81da44e68d3d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.526512 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.526345 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7c771f50-4b0e-4280-b47b-81da44e68d3d-crio-socket\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.526512 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.526356 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c771f50-4b0e-4280-b47b-81da44e68d3d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.526728 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.526653 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7c771f50-4b0e-4280-b47b-81da44e68d3d-data-volume\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.526962 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.526940 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7c771f50-4b0e-4280-b47b-81da44e68d3d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.528474 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.528449 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7c771f50-4b0e-4280-b47b-81da44e68d3d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.536328 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.536301 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt2gm\" (UniqueName: \"kubernetes.io/projected/7c771f50-4b0e-4280-b47b-81da44e68d3d-kube-api-access-vt2gm\") pod \"insights-runtime-extractor-chpgs\" (UID: \"7c771f50-4b0e-4280-b47b-81da44e68d3d\") " pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.629667 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.629575 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-chpgs"
Apr 16 14:55:03.746984 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.746954 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-chpgs"]
Apr 16 14:55:03.750219 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:55:03.750191 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c771f50_4b0e_4280_b47b_81da44e68d3d.slice/crio-6efc59e9096251db5867f601cf51d557d38ff7fcc0de9f94dad8f6ecc05e9c83 WatchSource:0}: Error finding container 6efc59e9096251db5867f601cf51d557d38ff7fcc0de9f94dad8f6ecc05e9c83: Status 404 returned error can't find the container with id 6efc59e9096251db5867f601cf51d557d38ff7fcc0de9f94dad8f6ecc05e9c83
Apr 16 14:55:03.859688 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.859652 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-chpgs" event={"ID":"7c771f50-4b0e-4280-b47b-81da44e68d3d","Type":"ContainerStarted","Data":"d2b1c5e6fa19e7acbc1596d63dc8fd7944bea7a8761d2022b482ee33b307a83e"}
Apr 16 14:55:03.859688 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:03.859693 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-chpgs" event={"ID":"7c771f50-4b0e-4280-b47b-81da44e68d3d","Type":"ContainerStarted","Data":"6efc59e9096251db5867f601cf51d557d38ff7fcc0de9f94dad8f6ecc05e9c83"}
Apr 16 14:55:04.862925 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:04.862837 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-chpgs" event={"ID":"7c771f50-4b0e-4280-b47b-81da44e68d3d","Type":"ContainerStarted","Data":"9f5c64d93c2b7504d7d8d21667cf636dab19f9f1f3072622e2199c89d97823c9"}
Apr 16 14:55:05.322801 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:05.322763 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m9hrv"
Apr 16 14:55:05.325295 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:05.325275 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-njff8\""
Apr 16 14:55:05.334153 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:05.334117 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m9hrv"
Apr 16 14:55:05.467594 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:05.467558 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m9hrv"]
Apr 16 14:55:05.471813 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:55:05.471773 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod927790a5_7672_4d93_a725_5924ae587d09.slice/crio-11d9ef1da72fdaf20027ad03eb04baa1fe2e79206ca3400fce46db9f40dad74c WatchSource:0}: Error finding container 11d9ef1da72fdaf20027ad03eb04baa1fe2e79206ca3400fce46db9f40dad74c: Status 404 returned error can't find the container with id 11d9ef1da72fdaf20027ad03eb04baa1fe2e79206ca3400fce46db9f40dad74c
Apr 16 14:55:05.866929 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:05.866886 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9hrv" event={"ID":"927790a5-7672-4d93-a725-5924ae587d09","Type":"ContainerStarted","Data":"11d9ef1da72fdaf20027ad03eb04baa1fe2e79206ca3400fce46db9f40dad74c"}
Apr 16 14:55:06.325403 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:06.325365 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn"
Apr 16 14:55:06.871144 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:06.871088 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-chpgs" event={"ID":"7c771f50-4b0e-4280-b47b-81da44e68d3d","Type":"ContainerStarted","Data":"497e9ad948122636c7bcea0b6c2cb3d61f1f564b54bfd3d239b6e2e8ac27e5f1"}
Apr 16 14:55:06.891980 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:06.891902 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-chpgs" podStartSLOduration=1.765736142 podStartE2EDuration="3.891883696s" podCreationTimestamp="2026-04-16 14:55:03 +0000 UTC" firstStartedPulling="2026-04-16 14:55:03.809729076 +0000 UTC m=+168.150859909" lastFinishedPulling="2026-04-16 14:55:05.935876626 +0000 UTC m=+170.277007463" observedRunningTime="2026-04-16 14:55:06.890704571 +0000 UTC m=+171.231835423" watchObservedRunningTime="2026-04-16 14:55:06.891883696 +0000 UTC m=+171.233014552"
Apr 16 14:55:07.876876 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:07.876836 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9hrv" event={"ID":"927790a5-7672-4d93-a725-5924ae587d09","Type":"ContainerStarted","Data":"634477c85eb8bc5746622bca08c797635708c325d4d93aa3e512efe6f769170d"}
Apr 16 14:55:07.877280 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:07.876886 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9hrv" event={"ID":"927790a5-7672-4d93-a725-5924ae587d09","Type":"ContainerStarted","Data":"c122117f9a968126a0e5e1c0c3eeb3aa89eab99dfa22853e1de9b02ee98babcc"}
Apr 16 14:55:07.877280 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:07.877005 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-m9hrv"
Apr 16 14:55:07.902415 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:07.902351 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m9hrv" podStartSLOduration=138.585428112 podStartE2EDuration="2m19.902333649s" podCreationTimestamp="2026-04-16 14:52:48 +0000 UTC" firstStartedPulling="2026-04-16 14:55:05.474048147 +0000 UTC m=+169.815178981" lastFinishedPulling="2026-04-16 14:55:06.790953668 +0000 UTC m=+171.132084518" observedRunningTime="2026-04-16 14:55:07.901980618 +0000 UTC m=+172.243111477" watchObservedRunningTime="2026-04-16 14:55:07.902333649 +0000 UTC m=+172.243464500"
Apr 16 14:55:13.419768 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.419732 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-srtsh"]
Apr 16 14:55:13.422986 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.422966 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh"
Apr 16 14:55:13.426106 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.426086 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:55:13.426804 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.426780 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:55:13.426904 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.426787 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:55:13.427115 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.427090 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 14:55:13.427228 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.427094 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 14:55:13.427228 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.427219 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-p4qf5\""
Apr 16 14:55:13.427352 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.427334 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 14:55:13.437980 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.437955 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-srtsh"]
Apr 16 14:55:13.439007 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.438988 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-d4tk9"]
Apr 16 14:55:13.441947 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.441929 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d4tk9"
Apr 16 14:55:13.445261 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.445229 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:55:13.445367 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.445234 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:55:13.445367 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.445339 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mcggm\""
Apr 16 14:55:13.445477 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.445460 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:55:13.500894 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.500859 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh"
Apr 16 14:55:13.501098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.500910 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-tls\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9"
Apr 16 14:55:13.501098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.500967 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mdbl\" (UniqueName: \"kubernetes.io/projected/c5878d8f-236e-48f5-bfaf-d655e638f782-kube-api-access-9mdbl\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9"
Apr 16 14:55:13.501098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501007 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4cpf\" (UniqueName: \"kubernetes.io/projected/98083ded-4c68-4825-94cf-619a9f409bd2-kube-api-access-r4cpf\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh"
Apr 16 14:55:13.501098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501038 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-sys\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9"
Apr 16 14:55:13.501098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501091 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/98083ded-4c68-4825-94cf-619a9f409bd2-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh"
Apr 16 14:55:13.501345 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501113 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98083ded-4c68-4825-94cf-619a9f409bd2-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh"
Apr 16 14:55:13.501345 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501135 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9"
Apr 16 14:55:13.501345 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501200 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh"
Apr 16 14:55:13.501345 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501245 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-root\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9"
Apr 16 14:55:13.501345 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501263 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-wtmp\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9"
Apr 16 14:55:13.501345 ip-10-0-142-46 kubenswrapper[2565]: I0416
14:55:13.501304 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-accelerators-collector-config\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.501345 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501327 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.501345 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501344 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5878d8f-236e-48f5-bfaf-d655e638f782-metrics-client-ca\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.501590 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.501361 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-textfile\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.602650 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602608 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-tls\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.602650 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602651 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mdbl\" (UniqueName: \"kubernetes.io/projected/c5878d8f-236e-48f5-bfaf-d655e638f782-kube-api-access-9mdbl\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602671 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4cpf\" (UniqueName: \"kubernetes.io/projected/98083ded-4c68-4825-94cf-619a9f409bd2-kube-api-access-r4cpf\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602694 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-sys\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602732 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/98083ded-4c68-4825-94cf-619a9f409bd2-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602756 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98083ded-4c68-4825-94cf-619a9f409bd2-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602783 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602809 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602836 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-root\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602832 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-sys\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " 
pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.602912 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602858 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-wtmp\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.603377 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602956 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-accelerators-collector-config\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.603377 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.602963 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-wtmp\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.603377 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603113 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.603377 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603146 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c5878d8f-236e-48f5-bfaf-d655e638f782-metrics-client-ca\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.603377 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603206 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-textfile\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.603377 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603250 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.603377 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603273 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/98083ded-4c68-4825-94cf-619a9f409bd2-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.603717 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603550 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c5878d8f-236e-48f5-bfaf-d655e638f782-root\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.603717 ip-10-0-142-46 kubenswrapper[2565]: I0416 
14:55:13.603569 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-textfile\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.603717 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603574 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98083ded-4c68-4825-94cf-619a9f409bd2-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.603717 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603581 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.603917 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603833 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-accelerators-collector-config\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.603990 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.603969 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5878d8f-236e-48f5-bfaf-d655e638f782-metrics-client-ca\") pod 
\"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.605222 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.605201 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-tls\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.605434 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.605414 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5878d8f-236e-48f5-bfaf-d655e638f782-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.605612 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.605587 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.605849 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.605832 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/98083ded-4c68-4825-94cf-619a9f409bd2-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.610845 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.610818 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9mdbl\" (UniqueName: \"kubernetes.io/projected/c5878d8f-236e-48f5-bfaf-d655e638f782-kube-api-access-9mdbl\") pod \"node-exporter-d4tk9\" (UID: \"c5878d8f-236e-48f5-bfaf-d655e638f782\") " pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.610955 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.610889 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4cpf\" (UniqueName: \"kubernetes.io/projected/98083ded-4c68-4825-94cf-619a9f409bd2-kube-api-access-r4cpf\") pod \"kube-state-metrics-7479c89684-srtsh\" (UID: \"98083ded-4c68-4825-94cf-619a9f409bd2\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.732679 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.732563 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" Apr 16 14:55:13.751565 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.751535 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-d4tk9" Apr 16 14:55:13.759380 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:55:13.759344 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5878d8f_236e_48f5_bfaf_d655e638f782.slice/crio-7a2bff252b0c942b8596a19b3bd31a73bce9aed9d7acf063f2700e7040d82547 WatchSource:0}: Error finding container 7a2bff252b0c942b8596a19b3bd31a73bce9aed9d7acf063f2700e7040d82547: Status 404 returned error can't find the container with id 7a2bff252b0c942b8596a19b3bd31a73bce9aed9d7acf063f2700e7040d82547 Apr 16 14:55:13.852705 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.852651 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-srtsh"] Apr 16 14:55:13.855869 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:55:13.855845 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98083ded_4c68_4825_94cf_619a9f409bd2.slice/crio-ba4c3389433d66d88f333a285b3b7bca8dbd8c3757bb9de6be8f0cc5e5f83e42 WatchSource:0}: Error finding container ba4c3389433d66d88f333a285b3b7bca8dbd8c3757bb9de6be8f0cc5e5f83e42: Status 404 returned error can't find the container with id ba4c3389433d66d88f333a285b3b7bca8dbd8c3757bb9de6be8f0cc5e5f83e42 Apr 16 14:55:13.892409 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.892374 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" event={"ID":"98083ded-4c68-4825-94cf-619a9f409bd2","Type":"ContainerStarted","Data":"ba4c3389433d66d88f333a285b3b7bca8dbd8c3757bb9de6be8f0cc5e5f83e42"} Apr 16 14:55:13.893437 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:13.893406 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d4tk9" 
event={"ID":"c5878d8f-236e-48f5-bfaf-d655e638f782","Type":"ContainerStarted","Data":"7a2bff252b0c942b8596a19b3bd31a73bce9aed9d7acf063f2700e7040d82547"} Apr 16 14:55:14.493607 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.493568 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:14.496908 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.496890 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.499071 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499050 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:55:14.499071 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499067 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:55:14.499269 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499082 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:55:14.499460 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499435 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:55:14.499585 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499486 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:55:14.499585 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499444 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:55:14.499585 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499566 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:55:14.499585 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499571 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9kvkc\"" Apr 16 14:55:14.499776 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499620 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:55:14.499881 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.499864 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:55:14.515224 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.515196 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:14.612357 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612322 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612501 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612369 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-config-volume\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612501 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612443 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612501 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612487 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612660 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612505 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612660 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612613 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-tls-assets\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612660 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612656 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-config-out\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612770 ip-10-0-142-46 
kubenswrapper[2565]: I0416 14:55:14.612709 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612770 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612730 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612770 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612763 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612882 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612794 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612922 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612879 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-web-config\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.612922 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.612915 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9zwk\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-kube-api-access-q9zwk\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.713901 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.713807 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9zwk\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-kube-api-access-q9zwk\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.713901 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.713863 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.713901 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.713895 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-config-volume\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.713920 2565 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.713958 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.713982 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.714017 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-tls-assets\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.714035 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-config-out\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 
14:55:14.714072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.714097 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.714124 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714202 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.714158 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714679 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.714235 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-web-config\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714679 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.714354 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714679 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:55:14.714381 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle podName:89dc6935-179d-4a46-b543-0cbdf8c01244 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:15.214356461 +0000 UTC m=+179.555487310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244") : configmap references non-existent config key: ca-bundle.crt Apr 16 14:55:14.714851 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.714721 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.714851 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:55:14.714824 2565 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 14:55:14.714925 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:55:14.714894 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls podName:89dc6935-179d-4a46-b543-0cbdf8c01244 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:15.214876475 +0000 UTC m=+179.556007323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244") : secret "alertmanager-main-tls" not found Apr 16 14:55:14.717215 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.717188 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.717445 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.717403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-config-out\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.718148 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.718120 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.718419 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.718397 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-tls-assets\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.718732 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.718673 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.718975 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.718956 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-web-config\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.719116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.719093 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.719905 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.719882 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-config-volume\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.722516 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.722493 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q9zwk\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-kube-api-access-q9zwk\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:14.898093 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.898050 2565 generic.go:358] "Generic (PLEG): container finished" podID="c5878d8f-236e-48f5-bfaf-d655e638f782" containerID="62755c1122b38f8002b75b56b1c7d7cf52c608a200d718a796c6c4456afa9826" exitCode=0 Apr 16 14:55:14.898258 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:14.898124 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d4tk9" event={"ID":"c5878d8f-236e-48f5-bfaf-d655e638f782","Type":"ContainerDied","Data":"62755c1122b38f8002b75b56b1c7d7cf52c608a200d718a796c6c4456afa9826"} Apr 16 14:55:15.218511 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.218476 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:15.218655 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.218568 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:15.219519 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.219496 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:15.220970 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.220949 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:15.405906 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.405879 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:15.548985 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.548938 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:15.551694 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:55:15.551669 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89dc6935_179d_4a46_b543_0cbdf8c01244.slice/crio-b0aaeaf13242dfecbeb2afbf76d4d197d1dc014875f956a7ac448aacd2326078 WatchSource:0}: Error finding container b0aaeaf13242dfecbeb2afbf76d4d197d1dc014875f956a7ac448aacd2326078: Status 404 returned error can't find the container with id b0aaeaf13242dfecbeb2afbf76d4d197d1dc014875f956a7ac448aacd2326078 Apr 16 14:55:15.902262 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.902226 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" event={"ID":"98083ded-4c68-4825-94cf-619a9f409bd2","Type":"ContainerStarted","Data":"27f1108bdaad3624e99e2f3fb5a315af9dfd904d0fdaba9949e7ddac7cf1ce44"} Apr 16 14:55:15.902436 ip-10-0-142-46 kubenswrapper[2565]: I0416 
14:55:15.902270 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" event={"ID":"98083ded-4c68-4825-94cf-619a9f409bd2","Type":"ContainerStarted","Data":"cb464cd8ea2b2e8c796f14326bd2f0f8b7bdb4f159588a3bca75c812ad724494"} Apr 16 14:55:15.902436 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.902283 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" event={"ID":"98083ded-4c68-4825-94cf-619a9f409bd2","Type":"ContainerStarted","Data":"1b825beb3ee14fe00503cc0409ae44829a349887e2c0903853741d315ee80a20"} Apr 16 14:55:15.903340 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.903316 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerStarted","Data":"b0aaeaf13242dfecbeb2afbf76d4d197d1dc014875f956a7ac448aacd2326078"} Apr 16 14:55:15.905095 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.905074 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d4tk9" event={"ID":"c5878d8f-236e-48f5-bfaf-d655e638f782","Type":"ContainerStarted","Data":"c27aa0b5cd4f8a0f2440608d1b725e1b40cac8a9f1c0e398dc2a746817b103da"} Apr 16 14:55:15.905199 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.905104 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d4tk9" event={"ID":"c5878d8f-236e-48f5-bfaf-d655e638f782","Type":"ContainerStarted","Data":"e493bf8405c0f955c7009692beda501c94d76fa1243890db0908a7b9e6eabe2f"} Apr 16 14:55:15.918583 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.918540 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-srtsh" podStartSLOduration=1.515930102 podStartE2EDuration="2.918527853s" podCreationTimestamp="2026-04-16 14:55:13 +0000 UTC" 
firstStartedPulling="2026-04-16 14:55:13.857658671 +0000 UTC m=+178.198789504" lastFinishedPulling="2026-04-16 14:55:15.26025642 +0000 UTC m=+179.601387255" observedRunningTime="2026-04-16 14:55:15.916914139 +0000 UTC m=+180.258044993" watchObservedRunningTime="2026-04-16 14:55:15.918527853 +0000 UTC m=+180.259658741" Apr 16 14:55:15.934138 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:15.934097 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-d4tk9" podStartSLOduration=2.287460274 podStartE2EDuration="2.93408652s" podCreationTimestamp="2026-04-16 14:55:13 +0000 UTC" firstStartedPulling="2026-04-16 14:55:13.761310962 +0000 UTC m=+178.102441796" lastFinishedPulling="2026-04-16 14:55:14.407937206 +0000 UTC m=+178.749068042" observedRunningTime="2026-04-16 14:55:15.933074791 +0000 UTC m=+180.274205651" watchObservedRunningTime="2026-04-16 14:55:15.93408652 +0000 UTC m=+180.275217374" Apr 16 14:55:16.909383 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:16.909349 2565 generic.go:358] "Generic (PLEG): container finished" podID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerID="aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4" exitCode=0 Apr 16 14:55:16.909773 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:16.909433 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerDied","Data":"aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4"} Apr 16 14:55:17.882321 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:17.882277 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m9hrv" Apr 16 14:55:18.920592 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:18.920559 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerStarted","Data":"ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0"} Apr 16 14:55:18.920592 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:18.920595 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerStarted","Data":"f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2"} Apr 16 14:55:18.920988 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:18.920605 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerStarted","Data":"62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c"} Apr 16 14:55:18.920988 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:18.920614 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerStarted","Data":"d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf"} Apr 16 14:55:18.920988 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:18.920623 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerStarted","Data":"5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5"} Apr 16 14:55:19.925373 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:19.925332 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerStarted","Data":"89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e"} Apr 16 14:55:19.950727 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:19.950674 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.281520068 podStartE2EDuration="5.950658613s" podCreationTimestamp="2026-04-16 14:55:14 +0000 UTC" firstStartedPulling="2026-04-16 14:55:15.553737024 +0000 UTC m=+179.894867862" lastFinishedPulling="2026-04-16 14:55:19.22287557 +0000 UTC m=+183.564006407" observedRunningTime="2026-04-16 14:55:19.948757883 +0000 UTC m=+184.289888738" watchObservedRunningTime="2026-04-16 14:55:19.950658613 +0000 UTC m=+184.291789502" Apr 16 14:55:34.322568 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:34.322504 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" podUID="f23c413d-24a7-4f72-9a36-fdce46324970" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:55:44.322493 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:44.322453 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" podUID="f23c413d-24a7-4f72-9a36-fdce46324970" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:55:54.321540 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:54.321499 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" podUID="f23c413d-24a7-4f72-9a36-fdce46324970" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:55:54.321965 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:54.321576 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" Apr 16 14:55:54.322083 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:54.322052 2565 kuberuntime_manager.go:1107] "Message for Container of pod" 
containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"bf4e2dd9d6d92493bcb5f4b73617662272bd60ecd8abd33e0b7d58bb2f56569a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 14:55:54.322120 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:54.322102 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" podUID="f23c413d-24a7-4f72-9a36-fdce46324970" containerName="service-proxy" containerID="cri-o://bf4e2dd9d6d92493bcb5f4b73617662272bd60ecd8abd33e0b7d58bb2f56569a" gracePeriod=30 Apr 16 14:55:55.019670 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:55.019635 2565 generic.go:358] "Generic (PLEG): container finished" podID="f23c413d-24a7-4f72-9a36-fdce46324970" containerID="bf4e2dd9d6d92493bcb5f4b73617662272bd60ecd8abd33e0b7d58bb2f56569a" exitCode=2 Apr 16 14:55:55.019843 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:55.019697 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" event={"ID":"f23c413d-24a7-4f72-9a36-fdce46324970","Type":"ContainerDied","Data":"bf4e2dd9d6d92493bcb5f4b73617662272bd60ecd8abd33e0b7d58bb2f56569a"} Apr 16 14:55:55.019843 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:55:55.019735 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b788f87b4-5xvqb" event={"ID":"f23c413d-24a7-4f72-9a36-fdce46324970","Type":"ContainerStarted","Data":"4eed3ced729211eb4812509b05cad77fcb0c26868ede119ffb4b8281231d992f"} Apr 16 14:56:27.080408 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:27.080369 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:56:27.082716 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:27.082689 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38d86a56-d8b6-4bb2-a413-3166ca14717f-metrics-certs\") pod \"network-metrics-daemon-j76vn\" (UID: \"38d86a56-d8b6-4bb2-a413-3166ca14717f\") " pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:56:27.328919 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:27.328888 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hst6r\"" Apr 16 14:56:27.336879 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:27.336823 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j76vn" Apr 16 14:56:27.452547 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:27.452510 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j76vn"] Apr 16 14:56:27.457240 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:56:27.457212 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d86a56_d8b6_4bb2_a413_3166ca14717f.slice/crio-098cfa9d27240c9ed7dc1e1701589fad0c3a0c9d54984346c322197647aeb95d WatchSource:0}: Error finding container 098cfa9d27240c9ed7dc1e1701589fad0c3a0c9d54984346c322197647aeb95d: Status 404 returned error can't find the container with id 098cfa9d27240c9ed7dc1e1701589fad0c3a0c9d54984346c322197647aeb95d Apr 16 14:56:28.110755 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:28.110723 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j76vn" 
event={"ID":"38d86a56-d8b6-4bb2-a413-3166ca14717f","Type":"ContainerStarted","Data":"098cfa9d27240c9ed7dc1e1701589fad0c3a0c9d54984346c322197647aeb95d"} Apr 16 14:56:29.114839 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:29.114802 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j76vn" event={"ID":"38d86a56-d8b6-4bb2-a413-3166ca14717f","Type":"ContainerStarted","Data":"9ebc5caed4f9f964ea0922f10d7c52d81077dedaacd73e423001c9ae57418da1"} Apr 16 14:56:29.114839 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:29.114837 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j76vn" event={"ID":"38d86a56-d8b6-4bb2-a413-3166ca14717f","Type":"ContainerStarted","Data":"d650ef0dbabd0d2a4daa50220c90d15429321882d4d89737f7c30a0c4968a829"} Apr 16 14:56:29.129689 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:29.129642 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j76vn" podStartSLOduration=252.199539782 podStartE2EDuration="4m13.129626612s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:56:27.458998854 +0000 UTC m=+251.800129687" lastFinishedPulling="2026-04-16 14:56:28.389085685 +0000 UTC m=+252.730216517" observedRunningTime="2026-04-16 14:56:29.127736382 +0000 UTC m=+253.468867238" watchObservedRunningTime="2026-04-16 14:56:29.129626612 +0000 UTC m=+253.470757500" Apr 16 14:56:33.602874 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:33.602826 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:56:33.603526 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:33.603376 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="alertmanager" 
containerID="cri-o://5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5" gracePeriod=120 Apr 16 14:56:33.603613 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:33.603518 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy-web" containerID="cri-o://62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c" gracePeriod=120 Apr 16 14:56:33.603613 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:33.603597 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy-metric" containerID="cri-o://ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0" gracePeriod=120 Apr 16 14:56:33.603613 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:33.603581 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="config-reloader" containerID="cri-o://d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf" gracePeriod=120 Apr 16 14:56:33.603767 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:33.603585 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="prom-label-proxy" containerID="cri-o://89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e" gracePeriod=120 Apr 16 14:56:33.603767 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:33.603548 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy" 
containerID="cri-o://f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2" gracePeriod=120 Apr 16 14:56:34.133854 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133823 2565 generic.go:358] "Generic (PLEG): container finished" podID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerID="89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e" exitCode=0 Apr 16 14:56:34.133854 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133850 2565 generic.go:358] "Generic (PLEG): container finished" podID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerID="ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0" exitCode=0 Apr 16 14:56:34.133854 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133857 2565 generic.go:358] "Generic (PLEG): container finished" podID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerID="f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2" exitCode=0 Apr 16 14:56:34.133854 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133863 2565 generic.go:358] "Generic (PLEG): container finished" podID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerID="d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf" exitCode=0 Apr 16 14:56:34.133854 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133868 2565 generic.go:358] "Generic (PLEG): container finished" podID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerID="5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5" exitCode=0 Apr 16 14:56:34.134148 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133892 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerDied","Data":"89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e"} Apr 16 14:56:34.134148 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133927 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerDied","Data":"ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0"}
Apr 16 14:56:34.134148 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133938 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerDied","Data":"f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2"}
Apr 16 14:56:34.134148 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133947 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerDied","Data":"d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf"}
Apr 16 14:56:34.134148 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.133957 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerDied","Data":"5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5"}
Apr 16 14:56:34.839853 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.839830 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:34.936218 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936103 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-web\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936218 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936145 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-config-out\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936218 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936187 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936221 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-cluster-tls-config\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936260 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-config-volume\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936302 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9zwk\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-kube-api-access-q9zwk\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936331 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-metric\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936358 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-tls-assets\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936398 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936430 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936463 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-main-db\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936485 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936487 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-metrics-client-ca\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936923 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936514 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-web-config\") pod \"89dc6935-179d-4a46-b543-0cbdf8c01244\" (UID: \"89dc6935-179d-4a46-b543-0cbdf8c01244\") "
Apr 16 14:56:34.936923 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936637 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:56:34.936923 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.936754 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:34.937278 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.937104 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:56:34.937457 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.937429 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:56:34.939956 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.939877 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-config-out" (OuterVolumeSpecName: "config-out") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:56:34.939956 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.939896 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:34.939956 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.939930 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:34.940145 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.939953 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-kube-api-access-q9zwk" (OuterVolumeSpecName: "kube-api-access-q9zwk") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "kube-api-access-q9zwk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:56:34.940145 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.940011 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:56:34.940145 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.940034 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:34.940667 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.940646 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-config-volume" (OuterVolumeSpecName: "config-volume") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:34.941452 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.941432 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:34.943423 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.943344 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:34.949407 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:34.949311 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-web-config" (OuterVolumeSpecName: "web-config") pod "89dc6935-179d-4a46-b543-0cbdf8c01244" (UID: "89dc6935-179d-4a46-b543-0cbdf8c01244"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:56:35.037873 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037831 2565 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-cluster-tls-config\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.037873 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037863 2565 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-config-volume\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.037873 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037873 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q9zwk\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-kube-api-access-q9zwk\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.037873 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037884 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.038116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037894 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89dc6935-179d-4a46-b543-0cbdf8c01244-tls-assets\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.038116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037905 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-main-tls\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.038116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037914 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.038116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037923 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-alertmanager-main-db\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.038116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037933 2565 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89dc6935-179d-4a46-b543-0cbdf8c01244-metrics-client-ca\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.038116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037941 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-web-config\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.038116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037950 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89dc6935-179d-4a46-b543-0cbdf8c01244-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.038116 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.037958 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89dc6935-179d-4a46-b543-0cbdf8c01244-config-out\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\""
Apr 16 14:56:35.139385 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.139351 2565 generic.go:358] "Generic (PLEG): container finished" podID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerID="62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c" exitCode=0
Apr 16 14:56:35.139533 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.139437 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerDied","Data":"62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c"}
Apr 16 14:56:35.139533 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.139452 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.139533 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.139479 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"89dc6935-179d-4a46-b543-0cbdf8c01244","Type":"ContainerDied","Data":"b0aaeaf13242dfecbeb2afbf76d4d197d1dc014875f956a7ac448aacd2326078"}
Apr 16 14:56:35.139533 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.139497 2565 scope.go:117] "RemoveContainer" containerID="89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e"
Apr 16 14:56:35.148373 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.148355 2565 scope.go:117] "RemoveContainer" containerID="ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0"
Apr 16 14:56:35.154601 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.154584 2565 scope.go:117] "RemoveContainer" containerID="f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2"
Apr 16 14:56:35.160911 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.160893 2565 scope.go:117] "RemoveContainer" containerID="62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c"
Apr 16 14:56:35.162264 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.162233 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:56:35.166199 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.166164 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:56:35.168105 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.168087 2565 scope.go:117] "RemoveContainer" containerID="d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf"
Apr 16 14:56:35.174122 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.174104 2565 scope.go:117] "RemoveContainer" containerID="5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5"
Apr 16 14:56:35.180340 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.180326 2565 scope.go:117] "RemoveContainer" containerID="aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4"
Apr 16 14:56:35.186334 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.186298 2565 scope.go:117] "RemoveContainer" containerID="89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e"
Apr 16 14:56:35.186575 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:56:35.186556 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e\": container with ID starting with 89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e not found: ID does not exist" containerID="89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e"
Apr 16 14:56:35.186627 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.186582 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e"} err="failed to get container status \"89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e\": rpc error: code = NotFound desc = could not find container \"89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e\": container with ID starting with 89a6eaa8480d44fdae261ed15d1547746408d817347305318c9b28c74085e53e not found: ID does not exist"
Apr 16 14:56:35.186627 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.186599 2565 scope.go:117] "RemoveContainer" containerID="ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0"
Apr 16 14:56:35.186810 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:56:35.186793 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0\": container with ID starting with ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0 not found: ID does not exist" containerID="ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0"
Apr 16 14:56:35.186848 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.186816 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0"} err="failed to get container status \"ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0\": rpc error: code = NotFound desc = could not find container \"ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0\": container with ID starting with ab67770f80ccf9c3e174cc686ca235ee55cf7c4d5569433d0e93a5f7239520f0 not found: ID does not exist"
Apr 16 14:56:35.186848 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.186835 2565 scope.go:117] "RemoveContainer" containerID="f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2"
Apr 16 14:56:35.187076 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:56:35.187057 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2\": container with ID starting with f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2 not found: ID does not exist" containerID="f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2"
Apr 16 14:56:35.187136 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.187078 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2"} err="failed to get container status \"f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2\": rpc error: code = NotFound desc = could not find container \"f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2\": container with ID starting with f8b5a918e2722949b76f784e3324baa90c1d70c7de8d250d24e7a0a349e166a2 not found: ID does not exist"
Apr 16 14:56:35.187136 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.187093 2565 scope.go:117] "RemoveContainer" containerID="62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c"
Apr 16 14:56:35.187397 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:56:35.187381 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c\": container with ID starting with 62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c not found: ID does not exist" containerID="62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c"
Apr 16 14:56:35.187442 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.187403 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c"} err="failed to get container status \"62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c\": rpc error: code = NotFound desc = could not find container \"62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c\": container with ID starting with 62b54838426e63510fda98fd5f2c1400ae04227b9110cd6ca92c33e08a09ba2c not found: ID does not exist"
Apr 16 14:56:35.187442 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.187420 2565 scope.go:117] "RemoveContainer" containerID="d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf"
Apr 16 14:56:35.187668 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:56:35.187652 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf\": container with ID starting with d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf not found: ID does not exist" containerID="d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf"
Apr 16 14:56:35.187727 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.187670 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf"} err="failed to get container status \"d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf\": rpc error: code = NotFound desc = could not find container \"d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf\": container with ID starting with d022ad10a914a87da0356b6e050cc6dde9b4a72430ee48ef45546bdb8db339cf not found: ID does not exist"
Apr 16 14:56:35.187727 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.187683 2565 scope.go:117] "RemoveContainer" containerID="5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5"
Apr 16 14:56:35.187906 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:56:35.187890 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5\": container with ID starting with 5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5 not found: ID does not exist" containerID="5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5"
Apr 16 14:56:35.187952 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.187910 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5"} err="failed to get container status \"5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5\": rpc error: code = NotFound desc = could not find container \"5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5\": container with ID starting with 5b2c235035ad8fc4d069c672a3249c09cc89b0289a9b471e11b7d4ef8e1f1bc5 not found: ID does not exist"
Apr 16 14:56:35.187952 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.187928 2565 scope.go:117] "RemoveContainer" containerID="aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4"
Apr 16 14:56:35.188165 ip-10-0-142-46 kubenswrapper[2565]: E0416 14:56:35.188146 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4\": container with ID starting with aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4 not found: ID does not exist" containerID="aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4"
Apr 16 14:56:35.188290 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.188166 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4"} err="failed to get container status \"aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4\": rpc error: code = NotFound desc = could not find container \"aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4\": container with ID starting with aa27c02c1d1623d116288a6f8ff0670cf5379e0889badd796d021bafe765fdd4 not found: ID does not exist"
Apr 16 14:56:35.190311 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190283 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:56:35.190818 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190800 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy-metric"
Apr 16 14:56:35.190899 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190829 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy-metric"
Apr 16 14:56:35.190899 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190842 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="alertmanager"
Apr 16 14:56:35.190899 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190851 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="alertmanager"
Apr 16 14:56:35.190899 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190875 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="init-config-reloader"
Apr 16 14:56:35.190899 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190890 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="init-config-reloader"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190901 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy-web"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190909 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy-web"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190918 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190926 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190947 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="config-reloader"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190955 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="config-reloader"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190965 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="prom-label-proxy"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.190975 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="prom-label-proxy"
Apr 16 14:56:35.191098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.191091 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="config-reloader"
Apr 16 14:56:35.191461 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.191102 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy-web"
Apr 16 14:56:35.191461 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.191120 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy"
Apr 16 14:56:35.191461 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.191130 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="prom-label-proxy"
Apr 16 14:56:35.191461 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.191140 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="alertmanager"
Apr 16 14:56:35.191461 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.191155 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" containerName="kube-rbac-proxy-metric"
Apr 16 14:56:35.197256 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.197238 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.199687 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.199667 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 14:56:35.199774 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.199667 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 14:56:35.199774 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.199669 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 14:56:35.199997 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.199981 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 14:56:35.200098 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.200083 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 14:56:35.200190 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.200128 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 14:56:35.200267 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.200243 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 14:56:35.200382 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.200293 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 14:56:35.200438 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.200384 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9kvkc\""
Apr 16 14:56:35.204972 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.204921 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 14:56:35.205643 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.205624 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:56:35.340442 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340413 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/259010e4-75f4-4aad-bf92-d9e608b8e229-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.340442 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340444 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-web-config\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.340644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340469 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/259010e4-75f4-4aad-bf92-d9e608b8e229-config-out\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.340644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340497 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.340644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340525 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.340644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340552 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.340644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340566 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/259010e4-75f4-4aad-bf92-d9e608b8e229-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:56:35.340644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340584 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID:
\"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.340644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340599 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b65wj\" (UniqueName: \"kubernetes.io/projected/259010e4-75f4-4aad-bf92-d9e608b8e229-kube-api-access-b65wj\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.340644 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340618 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-config-volume\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.340861 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340675 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/259010e4-75f4-4aad-bf92-d9e608b8e229-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.340861 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340714 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/259010e4-75f4-4aad-bf92-d9e608b8e229-tls-assets\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.340861 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.340758 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442225 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442085 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/259010e4-75f4-4aad-bf92-d9e608b8e229-config-out\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442225 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442118 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442225 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442146 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442225 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442188 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442225 ip-10-0-142-46 
kubenswrapper[2565]: I0416 14:56:35.442215 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/259010e4-75f4-4aad-bf92-d9e608b8e229-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442648 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442245 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442648 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442274 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b65wj\" (UniqueName: \"kubernetes.io/projected/259010e4-75f4-4aad-bf92-d9e608b8e229-kube-api-access-b65wj\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442648 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442324 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-config-volume\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442648 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442355 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/259010e4-75f4-4aad-bf92-d9e608b8e229-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442648 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442384 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/259010e4-75f4-4aad-bf92-d9e608b8e229-tls-assets\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442648 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442419 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442648 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442489 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/259010e4-75f4-4aad-bf92-d9e608b8e229-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.442648 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.442541 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-web-config\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.445999 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.444436 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/259010e4-75f4-4aad-bf92-d9e608b8e229-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.445999 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.445129 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.445999 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.445215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/259010e4-75f4-4aad-bf92-d9e608b8e229-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.445999 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.445221 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-web-config\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.445999 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.445324 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.445999 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.445385 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.445999 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.445465 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/259010e4-75f4-4aad-bf92-d9e608b8e229-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.445999 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.446054 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.447292 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.447270 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-config-volume\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.447489 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.447470 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/259010e4-75f4-4aad-bf92-d9e608b8e229-tls-assets\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.447540 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.447517 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/259010e4-75f4-4aad-bf92-d9e608b8e229-config-out\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.447593 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.447576 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/259010e4-75f4-4aad-bf92-d9e608b8e229-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.452465 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.452443 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b65wj\" (UniqueName: \"kubernetes.io/projected/259010e4-75f4-4aad-bf92-d9e608b8e229-kube-api-access-b65wj\") pod \"alertmanager-main-0\" (UID: \"259010e4-75f4-4aad-bf92-d9e608b8e229\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.507493 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.507459 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:56:35.630664 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:35.630631 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:56:35.632964 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:56:35.632941 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod259010e4_75f4_4aad_bf92_d9e608b8e229.slice/crio-759386df6ad0949aba7d641d9527f59a836141973ef3d9d3934197a66c80af4e WatchSource:0}: Error finding container 759386df6ad0949aba7d641d9527f59a836141973ef3d9d3934197a66c80af4e: Status 404 returned error can't find the container with id 759386df6ad0949aba7d641d9527f59a836141973ef3d9d3934197a66c80af4e Apr 16 14:56:36.144407 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:36.144318 2565 generic.go:358] "Generic (PLEG): container finished" podID="259010e4-75f4-4aad-bf92-d9e608b8e229" containerID="bac2598d9d9b531abe549752608cf1f098f78ba8337c266d286a81e0f4cf2704" exitCode=0 Apr 16 14:56:36.144407 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:36.144364 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"259010e4-75f4-4aad-bf92-d9e608b8e229","Type":"ContainerDied","Data":"bac2598d9d9b531abe549752608cf1f098f78ba8337c266d286a81e0f4cf2704"} Apr 16 14:56:36.144407 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:36.144386 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"259010e4-75f4-4aad-bf92-d9e608b8e229","Type":"ContainerStarted","Data":"759386df6ad0949aba7d641d9527f59a836141973ef3d9d3934197a66c80af4e"} Apr 16 14:56:36.327477 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:36.327450 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89dc6935-179d-4a46-b543-0cbdf8c01244" 
path="/var/lib/kubelet/pods/89dc6935-179d-4a46-b543-0cbdf8c01244/volumes" Apr 16 14:56:37.154106 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.154070 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"259010e4-75f4-4aad-bf92-d9e608b8e229","Type":"ContainerStarted","Data":"4ea0f67a427096da2c3975a3e6437d88cbf9db59d8710501b851f4e9e53b480e"} Apr 16 14:56:37.154106 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.154105 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"259010e4-75f4-4aad-bf92-d9e608b8e229","Type":"ContainerStarted","Data":"c8b476122ec0083bc6a41b6201dd0e16cedfbd1da2572353f9ec70eaebb78d24"} Apr 16 14:56:37.154106 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.154115 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"259010e4-75f4-4aad-bf92-d9e608b8e229","Type":"ContainerStarted","Data":"6b6445b106bb8b0bf820a2b1e580d21062339a1d27737652cde8257da9e9059c"} Apr 16 14:56:37.154546 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.154124 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"259010e4-75f4-4aad-bf92-d9e608b8e229","Type":"ContainerStarted","Data":"a7efe84985a00ae917ac21b3ca00ff0c23df6a5abb27cb5897e780858f70f9cd"} Apr 16 14:56:37.154546 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.154132 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"259010e4-75f4-4aad-bf92-d9e608b8e229","Type":"ContainerStarted","Data":"5719cecc2de1d59e4bde973d24f573bc7802f3b666634268ffafa8a963c5245d"} Apr 16 14:56:37.154546 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.154140 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"259010e4-75f4-4aad-bf92-d9e608b8e229","Type":"ContainerStarted","Data":"40c08837032e9a5ce83b94ff2c71ab1b481d5142eea74bcd06038e50fe0eb255"} Apr 16 14:56:37.180727 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.180672 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.180653091 podStartE2EDuration="2.180653091s" podCreationTimestamp="2026-04-16 14:56:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:56:37.179900439 +0000 UTC m=+261.521031294" watchObservedRunningTime="2026-04-16 14:56:37.180653091 +0000 UTC m=+261.521783946" Apr 16 14:56:37.644374 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.644289 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-968985cc8-wmt5h"] Apr 16 14:56:37.647558 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.647540 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.650224 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.650193 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-fp5vd\"" Apr 16 14:56:37.650331 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.650243 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 14:56:37.650331 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.650248 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 14:56:37.650436 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.650340 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 14:56:37.650436 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.650374 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 14:56:37.650792 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.650774 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 14:56:37.657795 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.657779 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 14:56:37.661488 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.661470 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-968985cc8-wmt5h"] Apr 16 14:56:37.758806 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.758771 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.758963 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.758822 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r47tb\" (UniqueName: \"kubernetes.io/projected/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-kube-api-access-r47tb\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.758963 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.758852 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-serving-certs-ca-bundle\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.758963 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.758885 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-federate-client-tls\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.758963 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.758904 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.758963 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.758920 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-metrics-client-ca\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.759136 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.758974 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-telemeter-client-tls\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.759136 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.759001 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-secret-telemeter-client\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.859581 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.859544 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: 
\"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.859763 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.859592 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r47tb\" (UniqueName: \"kubernetes.io/projected/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-kube-api-access-r47tb\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.859763 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.859627 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-serving-certs-ca-bundle\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.859763 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.859651 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-federate-client-tls\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.859763 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.859679 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.859763 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.859702 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-metrics-client-ca\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.860011 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.859875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-telemeter-client-tls\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.860011 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.859953 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-secret-telemeter-client\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.860424 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.860397 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-serving-certs-ca-bundle\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.860537 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.860438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-metrics-client-ca\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: 
\"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.860710 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.860690 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.862585 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.862556 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.862676 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.862556 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-telemeter-client-tls\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.862676 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.862563 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-federate-client-tls\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.862748 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.862688 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-secret-telemeter-client\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.869929 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.869909 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r47tb\" (UniqueName: \"kubernetes.io/projected/9a9b599d-95ba-4f77-ba75-7b4ef3afdc51-kube-api-access-r47tb\") pod \"telemeter-client-968985cc8-wmt5h\" (UID: \"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51\") " pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:37.957336 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:37.957311 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" Apr 16 14:56:38.077481 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:38.077456 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-968985cc8-wmt5h"] Apr 16 14:56:38.080556 ip-10-0-142-46 kubenswrapper[2565]: W0416 14:56:38.080528 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a9b599d_95ba_4f77_ba75_7b4ef3afdc51.slice/crio-1b0f3b684382e3c5e6bba9bcabf7e6cef3254c98f55f73fe249fe64d3f6aa1a5 WatchSource:0}: Error finding container 1b0f3b684382e3c5e6bba9bcabf7e6cef3254c98f55f73fe249fe64d3f6aa1a5: Status 404 returned error can't find the container with id 1b0f3b684382e3c5e6bba9bcabf7e6cef3254c98f55f73fe249fe64d3f6aa1a5 Apr 16 14:56:38.159140 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:38.159104 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" 
event={"ID":"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51","Type":"ContainerStarted","Data":"1b0f3b684382e3c5e6bba9bcabf7e6cef3254c98f55f73fe249fe64d3f6aa1a5"} Apr 16 14:56:40.166957 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:40.166923 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" event={"ID":"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51","Type":"ContainerStarted","Data":"46f0eff52d61a7110d10ae185ad199d37006d470ef2ac076588ef073492264f4"} Apr 16 14:56:40.166957 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:40.166958 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" event={"ID":"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51","Type":"ContainerStarted","Data":"c6283bde43c1af1e4e07471d9c9f7e14d7aa140a2ee2da2cb083b5573fabc01f"} Apr 16 14:56:40.166957 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:40.166967 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" event={"ID":"9a9b599d-95ba-4f77-ba75-7b4ef3afdc51","Type":"ContainerStarted","Data":"bf35f3fd626d3555413c929d7f3b52bb68f53f1c30b1563ddf790b0946d8b41d"} Apr 16 14:56:40.189039 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:56:40.188980 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-968985cc8-wmt5h" podStartSLOduration=1.811041388 podStartE2EDuration="3.188966098s" podCreationTimestamp="2026-04-16 14:56:37 +0000 UTC" firstStartedPulling="2026-04-16 14:56:38.082391482 +0000 UTC m=+262.423522318" lastFinishedPulling="2026-04-16 14:56:39.460316192 +0000 UTC m=+263.801447028" observedRunningTime="2026-04-16 14:56:40.187856858 +0000 UTC m=+264.528987713" watchObservedRunningTime="2026-04-16 14:56:40.188966098 +0000 UTC m=+264.530096954" Apr 16 14:57:16.216508 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:57:16.216351 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 14:57:16.218363 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:57:16.218339 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 14:57:16.221984 ip-10-0-142-46 kubenswrapper[2565]: I0416 14:57:16.221964 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 15:00:09.612523 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.612479 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9"] Apr 16 15:00:09.615643 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.615622 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:09.617976 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.617952 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 15:00:09.617976 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.617961 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 15:00:09.618321 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.618307 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 15:00:09.619081 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.619063 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-fcsr5\"" Apr 16 15:00:09.619081 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.619074 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 15:00:09.619257 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:00:09.619140 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 15:00:09.623789 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.623765 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9"] Apr 16 15:00:09.727667 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.727605 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c59c30f4-17b6-4db7-a4ef-a8c328b82688-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:09.727667 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.727669 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgdqw\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-kube-api-access-dgdqw\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:09.727899 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.727710 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:09.828618 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.828577 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates\") 
pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:09.828803 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.828649 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c59c30f4-17b6-4db7-a4ef-a8c328b82688-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:09.828803 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.828672 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgdqw\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-kube-api-access-dgdqw\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:09.828803 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:09.828726 2565 secret.go:281] references non-existent secret key: tls.crt Apr 16 15:00:09.828803 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:09.828745 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 15:00:09.828803 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:09.828761 2565 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 15:00:09.828803 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:09.828779 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 15:00:09.829015 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:09.828842 
2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates podName:c59c30f4-17b6-4db7-a4ef-a8c328b82688 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:10.328826601 +0000 UTC m=+474.669957433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates") pod "keda-metrics-apiserver-7c9f485588-vhnb9" (UID: "c59c30f4-17b6-4db7-a4ef-a8c328b82688") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 15:00:09.829062 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.829012 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c59c30f4-17b6-4db7-a4ef-a8c328b82688-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:09.846731 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:09.846694 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgdqw\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-kube-api-access-dgdqw\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:10.332570 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:10.332534 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:10.332745 ip-10-0-142-46 kubenswrapper[2565]: 
E0416 15:00:10.332678 2565 secret.go:281] references non-existent secret key: tls.crt Apr 16 15:00:10.332745 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:10.332695 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 15:00:10.332745 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:10.332716 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9: references non-existent secret key: tls.crt Apr 16 15:00:10.332855 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:10.332775 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates podName:c59c30f4-17b6-4db7-a4ef-a8c328b82688 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:11.332757703 +0000 UTC m=+475.673888536 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates") pod "keda-metrics-apiserver-7c9f485588-vhnb9" (UID: "c59c30f4-17b6-4db7-a4ef-a8c328b82688") : references non-existent secret key: tls.crt Apr 16 15:00:11.341760 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:11.341714 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:11.342145 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:11.341868 2565 secret.go:281] references non-existent secret key: tls.crt Apr 16 15:00:11.342145 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:11.341887 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent 
secret key: tls.crt Apr 16 15:00:11.342145 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:11.341905 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9: references non-existent secret key: tls.crt Apr 16 15:00:11.342145 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:11.341958 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates podName:c59c30f4-17b6-4db7-a4ef-a8c328b82688 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:13.341943578 +0000 UTC m=+477.683074411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates") pod "keda-metrics-apiserver-7c9f485588-vhnb9" (UID: "c59c30f4-17b6-4db7-a4ef-a8c328b82688") : references non-existent secret key: tls.crt Apr 16 15:00:13.357687 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:13.357635 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:13.358088 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:13.357784 2565 secret.go:281] references non-existent secret key: tls.crt Apr 16 15:00:13.358088 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:13.357805 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 15:00:13.358088 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:13.357823 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9: references non-existent 
secret key: tls.crt Apr 16 15:00:13.358088 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:00:13.357877 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates podName:c59c30f4-17b6-4db7-a4ef-a8c328b82688 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:17.357862554 +0000 UTC m=+481.698993393 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates") pod "keda-metrics-apiserver-7c9f485588-vhnb9" (UID: "c59c30f4-17b6-4db7-a4ef-a8c328b82688") : references non-existent secret key: tls.crt Apr 16 15:00:17.388987 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:17.388939 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:17.391664 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:17.391637 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c59c30f4-17b6-4db7-a4ef-a8c328b82688-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vhnb9\" (UID: \"c59c30f4-17b6-4db7-a4ef-a8c328b82688\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:17.433088 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:17.433054 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-fcsr5\"" Apr 16 15:00:17.437164 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:17.437142 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:17.558908 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:17.558874 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9"] Apr 16 15:00:17.562005 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:00:17.561974 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc59c30f4_17b6_4db7_a4ef_a8c328b82688.slice/crio-77c705719ae7a2ca4a46ea6546fa34e19b135861736c30b33fc41afe8620f3af WatchSource:0}: Error finding container 77c705719ae7a2ca4a46ea6546fa34e19b135861736c30b33fc41afe8620f3af: Status 404 returned error can't find the container with id 77c705719ae7a2ca4a46ea6546fa34e19b135861736c30b33fc41afe8620f3af Apr 16 15:00:17.563440 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:17.563417 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:00:17.757697 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:17.757651 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" event={"ID":"c59c30f4-17b6-4db7-a4ef-a8c328b82688","Type":"ContainerStarted","Data":"77c705719ae7a2ca4a46ea6546fa34e19b135861736c30b33fc41afe8620f3af"} Apr 16 15:00:20.769283 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:20.769246 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" event={"ID":"c59c30f4-17b6-4db7-a4ef-a8c328b82688","Type":"ContainerStarted","Data":"0d2921cf7656710efca96155acdf2cfa980b9b3e79c4b429a168a4051326c11d"} Apr 16 15:00:20.769666 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:20.769360 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:00:20.783934 ip-10-0-142-46 kubenswrapper[2565]: 
I0416 15:00:20.783875 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" podStartSLOduration=8.789649744 podStartE2EDuration="11.783859673s" podCreationTimestamp="2026-04-16 15:00:09 +0000 UTC" firstStartedPulling="2026-04-16 15:00:17.563620756 +0000 UTC m=+481.904751592" lastFinishedPulling="2026-04-16 15:00:20.557830689 +0000 UTC m=+484.898961521" observedRunningTime="2026-04-16 15:00:20.783062104 +0000 UTC m=+485.124192958" watchObservedRunningTime="2026-04-16 15:00:20.783859673 +0000 UTC m=+485.124990527" Apr 16 15:00:31.776883 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:00:31.776848 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vhnb9" Apr 16 15:01:14.568780 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.568744 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-7cvhl"] Apr 16 15:01:14.571950 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.571924 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:14.576088 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.576058 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 15:01:14.576233 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.576077 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:01:14.576752 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.576734 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:01:14.576864 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.576776 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qpr59\"" Apr 16 15:01:14.581686 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.581659 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7cvhl"] Apr 16 15:01:14.638563 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.638515 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a61e3784-af03-4363-9a93-d6e2613fb991-data\") pod \"seaweedfs-86cc847c5c-7cvhl\" (UID: \"a61e3784-af03-4363-9a93-d6e2613fb991\") " pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:14.638563 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.638566 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz49q\" (UniqueName: \"kubernetes.io/projected/a61e3784-af03-4363-9a93-d6e2613fb991-kube-api-access-pz49q\") pod \"seaweedfs-86cc847c5c-7cvhl\" (UID: \"a61e3784-af03-4363-9a93-d6e2613fb991\") " pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:14.739418 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.739377 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a61e3784-af03-4363-9a93-d6e2613fb991-data\") pod \"seaweedfs-86cc847c5c-7cvhl\" (UID: \"a61e3784-af03-4363-9a93-d6e2613fb991\") " pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:14.739550 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.739425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pz49q\" (UniqueName: \"kubernetes.io/projected/a61e3784-af03-4363-9a93-d6e2613fb991-kube-api-access-pz49q\") pod \"seaweedfs-86cc847c5c-7cvhl\" (UID: \"a61e3784-af03-4363-9a93-d6e2613fb991\") " pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:14.739843 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.739821 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a61e3784-af03-4363-9a93-d6e2613fb991-data\") pod \"seaweedfs-86cc847c5c-7cvhl\" (UID: \"a61e3784-af03-4363-9a93-d6e2613fb991\") " pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:14.749739 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.749698 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz49q\" (UniqueName: \"kubernetes.io/projected/a61e3784-af03-4363-9a93-d6e2613fb991-kube-api-access-pz49q\") pod \"seaweedfs-86cc847c5c-7cvhl\" (UID: \"a61e3784-af03-4363-9a93-d6e2613fb991\") " pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:14.880961 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:14.880868 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:15.009813 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:15.009777 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7cvhl"] Apr 16 15:01:15.013623 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:01:15.013585 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda61e3784_af03_4363_9a93_d6e2613fb991.slice/crio-2bc686cf546e808a2a91fef8542c349e6ea1ac80306c985b5bfaeca1e9a1ac4e WatchSource:0}: Error finding container 2bc686cf546e808a2a91fef8542c349e6ea1ac80306c985b5bfaeca1e9a1ac4e: Status 404 returned error can't find the container with id 2bc686cf546e808a2a91fef8542c349e6ea1ac80306c985b5bfaeca1e9a1ac4e Apr 16 15:01:15.932207 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:15.932148 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7cvhl" event={"ID":"a61e3784-af03-4363-9a93-d6e2613fb991","Type":"ContainerStarted","Data":"2bc686cf546e808a2a91fef8542c349e6ea1ac80306c985b5bfaeca1e9a1ac4e"} Apr 16 15:01:17.802377 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:17.802351 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 15:01:17.939119 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:17.939079 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7cvhl" event={"ID":"a61e3784-af03-4363-9a93-d6e2613fb991","Type":"ContainerStarted","Data":"7aab27d961b342a2271fc3d20bf094d9990bedb114ee3c7eed0d3d2f4e7f5e2e"} Apr 16 15:01:17.939363 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:17.939227 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:01:17.953891 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:17.953839 2565 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve/seaweedfs-86cc847c5c-7cvhl" podStartSLOduration=1.168602267 podStartE2EDuration="3.953822939s" podCreationTimestamp="2026-04-16 15:01:14 +0000 UTC" firstStartedPulling="2026-04-16 15:01:15.015021163 +0000 UTC m=+539.356152014" lastFinishedPulling="2026-04-16 15:01:17.800241852 +0000 UTC m=+542.141372686" observedRunningTime="2026-04-16 15:01:17.952204108 +0000 UTC m=+542.293334960" watchObservedRunningTime="2026-04-16 15:01:17.953822939 +0000 UTC m=+542.294953794" Apr 16 15:01:23.944855 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:01:23.944825 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-7cvhl" Apr 16 15:02:16.240582 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:16.240551 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 15:02:16.241088 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:16.240632 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 15:02:38.923355 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:38.923270 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-f4zjc"] Apr 16 15:02:38.926514 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:38.926491 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:38.928715 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:38.928693 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 15:02:38.928821 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:38.928781 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-6mjht\"" Apr 16 15:02:38.935744 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:38.935719 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-f4zjc"] Apr 16 15:02:38.988082 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:38.988049 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/230f084b-c0f2-4810-894d-b3dc1ebf1291-tls-certs\") pod \"model-serving-api-86f7b4b499-f4zjc\" (UID: \"230f084b-c0f2-4810-894d-b3dc1ebf1291\") " pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:38.988294 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:38.988118 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsdnv\" (UniqueName: \"kubernetes.io/projected/230f084b-c0f2-4810-894d-b3dc1ebf1291-kube-api-access-dsdnv\") pod \"model-serving-api-86f7b4b499-f4zjc\" (UID: \"230f084b-c0f2-4810-894d-b3dc1ebf1291\") " pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:39.089270 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:39.089243 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/230f084b-c0f2-4810-894d-b3dc1ebf1291-tls-certs\") pod \"model-serving-api-86f7b4b499-f4zjc\" (UID: \"230f084b-c0f2-4810-894d-b3dc1ebf1291\") " pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:39.089434 ip-10-0-142-46 kubenswrapper[2565]: 
I0416 15:02:39.089418 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsdnv\" (UniqueName: \"kubernetes.io/projected/230f084b-c0f2-4810-894d-b3dc1ebf1291-kube-api-access-dsdnv\") pod \"model-serving-api-86f7b4b499-f4zjc\" (UID: \"230f084b-c0f2-4810-894d-b3dc1ebf1291\") " pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:39.091683 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:39.091661 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/230f084b-c0f2-4810-894d-b3dc1ebf1291-tls-certs\") pod \"model-serving-api-86f7b4b499-f4zjc\" (UID: \"230f084b-c0f2-4810-894d-b3dc1ebf1291\") " pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:39.097137 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:39.097114 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsdnv\" (UniqueName: \"kubernetes.io/projected/230f084b-c0f2-4810-894d-b3dc1ebf1291-kube-api-access-dsdnv\") pod \"model-serving-api-86f7b4b499-f4zjc\" (UID: \"230f084b-c0f2-4810-894d-b3dc1ebf1291\") " pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:39.237819 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:39.237730 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:39.356775 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:39.356737 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-f4zjc"] Apr 16 15:02:39.359721 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:02:39.359691 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod230f084b_c0f2_4810_894d_b3dc1ebf1291.slice/crio-98119b9ed63eb2b5149c48ea81c0a222311cd136b92a03bd9999eab4dd7a5cce WatchSource:0}: Error finding container 98119b9ed63eb2b5149c48ea81c0a222311cd136b92a03bd9999eab4dd7a5cce: Status 404 returned error can't find the container with id 98119b9ed63eb2b5149c48ea81c0a222311cd136b92a03bd9999eab4dd7a5cce Apr 16 15:02:40.164629 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:40.164584 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-f4zjc" event={"ID":"230f084b-c0f2-4810-894d-b3dc1ebf1291","Type":"ContainerStarted","Data":"98119b9ed63eb2b5149c48ea81c0a222311cd136b92a03bd9999eab4dd7a5cce"} Apr 16 15:02:42.172439 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:42.172408 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-f4zjc" event={"ID":"230f084b-c0f2-4810-894d-b3dc1ebf1291","Type":"ContainerStarted","Data":"19e57bbf92a6cc02a74a1af6494dda126dc05a2426a3b14413d1caf966f595ff"} Apr 16 15:02:42.172820 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:42.172465 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:02:42.205567 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:42.205523 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-f4zjc" podStartSLOduration=1.893024784 podStartE2EDuration="4.205506102s" podCreationTimestamp="2026-04-16 
15:02:38 +0000 UTC" firstStartedPulling="2026-04-16 15:02:39.361388771 +0000 UTC m=+623.702519607" lastFinishedPulling="2026-04-16 15:02:41.673870092 +0000 UTC m=+626.015000925" observedRunningTime="2026-04-16 15:02:42.20530388 +0000 UTC m=+626.546434734" watchObservedRunningTime="2026-04-16 15:02:42.205506102 +0000 UTC m=+626.546636959" Apr 16 15:02:53.180486 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:02:53.180457 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-f4zjc" Apr 16 15:03:13.554924 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:13.554889 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2"] Apr 16 15:03:13.558281 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:13.558263 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:03:13.560195 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:13.560157 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-l2z86\"" Apr 16 15:03:13.564732 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:13.564706 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2"] Apr 16 15:03:13.681136 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:13.681097 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e64adc3-bb32-4063-9c38-b687be03f0da-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2\" (UID: \"4e64adc3-bb32-4063-9c38-b687be03f0da\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:03:13.781486 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:03:13.781448 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e64adc3-bb32-4063-9c38-b687be03f0da-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2\" (UID: \"4e64adc3-bb32-4063-9c38-b687be03f0da\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:03:13.781815 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:13.781794 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e64adc3-bb32-4063-9c38-b687be03f0da-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2\" (UID: \"4e64adc3-bb32-4063-9c38-b687be03f0da\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:03:13.869115 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:13.869029 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:03:14.026240 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:14.026134 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2"] Apr 16 15:03:14.028369 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:03:14.028340 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e64adc3_bb32_4063_9c38_b687be03f0da.slice/crio-5bb5db4acd304d9a80b73ce1602bf073a92f89bd3e681c37cdd0c6307bf1cc3f WatchSource:0}: Error finding container 5bb5db4acd304d9a80b73ce1602bf073a92f89bd3e681c37cdd0c6307bf1cc3f: Status 404 returned error can't find the container with id 5bb5db4acd304d9a80b73ce1602bf073a92f89bd3e681c37cdd0c6307bf1cc3f Apr 16 15:03:14.261013 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:14.260977 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" event={"ID":"4e64adc3-bb32-4063-9c38-b687be03f0da","Type":"ContainerStarted","Data":"5bb5db4acd304d9a80b73ce1602bf073a92f89bd3e681c37cdd0c6307bf1cc3f"} Apr 16 15:03:19.279183 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:19.279144 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" event={"ID":"4e64adc3-bb32-4063-9c38-b687be03f0da","Type":"ContainerStarted","Data":"82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795"} Apr 16 15:03:23.292373 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:23.292339 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerID="82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795" exitCode=0 Apr 16 15:03:23.292373 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:23.292377 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" event={"ID":"4e64adc3-bb32-4063-9c38-b687be03f0da","Type":"ContainerDied","Data":"82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795"} Apr 16 15:03:38.344776 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:38.344741 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" event={"ID":"4e64adc3-bb32-4063-9c38-b687be03f0da","Type":"ContainerStarted","Data":"f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4"} Apr 16 15:03:41.353860 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:41.353827 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" event={"ID":"4e64adc3-bb32-4063-9c38-b687be03f0da","Type":"ContainerStarted","Data":"5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b"} Apr 16 15:03:41.354268 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:41.354065 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:03:41.355362 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:41.355336 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:03:41.371000 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:41.370735 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podStartSLOduration=1.142809833 podStartE2EDuration="28.370720365s" podCreationTimestamp="2026-04-16 15:03:13 
+0000 UTC" firstStartedPulling="2026-04-16 15:03:14.030021883 +0000 UTC m=+658.371152719" lastFinishedPulling="2026-04-16 15:03:41.257932408 +0000 UTC m=+685.599063251" observedRunningTime="2026-04-16 15:03:41.370043339 +0000 UTC m=+685.711174193" watchObservedRunningTime="2026-04-16 15:03:41.370720365 +0000 UTC m=+685.711851219" Apr 16 15:03:42.356903 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:42.356849 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:03:42.357371 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:42.356983 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:03:42.357943 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:42.357921 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:03:43.359873 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:43.359834 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:03:43.360395 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:43.360227 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:03:53.360618 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:53.360515 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:03:53.361061 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:03:53.360972 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:04:03.359960 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:03.359913 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:04:03.360459 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:03.360438 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:04:13.360551 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:13.360506 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:04:13.361027 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:04:13.360989 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:04:23.359857 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:23.359813 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:04:23.360431 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:23.360286 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:04:33.360006 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:33.359963 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:04:33.360505 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:33.360403 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:04:43.360375 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:43.360339 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:04:43.360876 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:43.360769 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:04:58.712736 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.712704 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2"] Apr 16 15:04:58.713249 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.713020 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" containerID="cri-o://f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4" gracePeriod=30 Apr 16 15:04:58.713249 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.713123 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" containerID="cri-o://5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b" gracePeriod=30 Apr 16 15:04:58.831263 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.831234 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78"] Apr 16 15:04:58.833495 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.833477 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:04:58.843860 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.843834 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78"] Apr 16 15:04:58.873984 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.873950 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k"] Apr 16 15:04:58.876384 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.876364 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:04:58.886076 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.886053 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k"] Apr 16 15:04:58.926210 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.926165 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e75b68-a5d7-41a3-b185-020a24ed1293-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k\" (UID: \"01e75b68-a5d7-41a3-b185-020a24ed1293\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:04:58.926392 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:58.926233 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/698398ad-669d-4a53-b8ce-c9e4d415cd4a-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78\" (UID: \"698398ad-669d-4a53-b8ce-c9e4d415cd4a\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:04:59.027571 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.027482 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e75b68-a5d7-41a3-b185-020a24ed1293-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k\" (UID: \"01e75b68-a5d7-41a3-b185-020a24ed1293\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:04:59.027571 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.027525 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/698398ad-669d-4a53-b8ce-c9e4d415cd4a-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78\" (UID: \"698398ad-669d-4a53-b8ce-c9e4d415cd4a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:04:59.027959 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.027935 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/698398ad-669d-4a53-b8ce-c9e4d415cd4a-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78\" (UID: \"698398ad-669d-4a53-b8ce-c9e4d415cd4a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:04:59.027959 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.027949 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e75b68-a5d7-41a3-b185-020a24ed1293-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k\" (UID: \"01e75b68-a5d7-41a3-b185-020a24ed1293\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:04:59.143676 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.143644 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:04:59.186606 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.186570 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:04:59.272269 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.272227 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78"] Apr 16 15:04:59.274440 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:04:59.274410 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698398ad_669d_4a53_b8ce_c9e4d415cd4a.slice/crio-a7c6f007af9393d687ba4610a6142f099fc4d11d1bec00de37694b88dbf327a4 WatchSource:0}: Error finding container a7c6f007af9393d687ba4610a6142f099fc4d11d1bec00de37694b88dbf327a4: Status 404 returned error can't find the container with id a7c6f007af9393d687ba4610a6142f099fc4d11d1bec00de37694b88dbf327a4 Apr 16 15:04:59.324680 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.324653 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k"] Apr 16 15:04:59.327489 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:04:59.327464 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e75b68_a5d7_41a3_b185_020a24ed1293.slice/crio-5b1a4675adbfce59286bb3de4528a296771f28edacdfbb58bba1bf6bec4cfcfd WatchSource:0}: Error finding container 5b1a4675adbfce59286bb3de4528a296771f28edacdfbb58bba1bf6bec4cfcfd: Status 404 
returned error can't find the container with id 5b1a4675adbfce59286bb3de4528a296771f28edacdfbb58bba1bf6bec4cfcfd Apr 16 15:04:59.570036 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.569929 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" event={"ID":"01e75b68-a5d7-41a3-b185-020a24ed1293","Type":"ContainerStarted","Data":"b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd"} Apr 16 15:04:59.570036 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.569979 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" event={"ID":"01e75b68-a5d7-41a3-b185-020a24ed1293","Type":"ContainerStarted","Data":"5b1a4675adbfce59286bb3de4528a296771f28edacdfbb58bba1bf6bec4cfcfd"} Apr 16 15:04:59.571475 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.571446 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" event={"ID":"698398ad-669d-4a53-b8ce-c9e4d415cd4a","Type":"ContainerStarted","Data":"93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a"} Apr 16 15:04:59.571576 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:04:59.571484 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" event={"ID":"698398ad-669d-4a53-b8ce-c9e4d415cd4a","Type":"ContainerStarted","Data":"a7c6f007af9393d687ba4610a6142f099fc4d11d1bec00de37694b88dbf327a4"} Apr 16 15:05:03.360788 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:03.360691 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:05:03.361351 
ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:03.361325 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:05:03.585118 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:03.585080 2565 generic.go:358] "Generic (PLEG): container finished" podID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerID="b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd" exitCode=0 Apr 16 15:05:03.585313 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:03.585130 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" event={"ID":"01e75b68-a5d7-41a3-b185-020a24ed1293","Type":"ContainerDied","Data":"b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd"} Apr 16 15:05:03.586610 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:03.586589 2565 generic.go:358] "Generic (PLEG): container finished" podID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerID="93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a" exitCode=0 Apr 16 15:05:03.586708 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:03.586677 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" event={"ID":"698398ad-669d-4a53-b8ce-c9e4d415cd4a","Type":"ContainerDied","Data":"93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a"} Apr 16 15:05:03.588646 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:03.588626 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerID="f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4" exitCode=0 Apr 16 15:05:03.588746 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:03.588675 2565 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" event={"ID":"4e64adc3-bb32-4063-9c38-b687be03f0da","Type":"ContainerDied","Data":"f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4"} Apr 16 15:05:04.594431 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:04.594398 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" event={"ID":"698398ad-669d-4a53-b8ce-c9e4d415cd4a","Type":"ContainerStarted","Data":"a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d"} Apr 16 15:05:04.595005 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:04.594746 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:05:04.596148 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:04.596083 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 16 15:05:04.612304 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:04.612032 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podStartSLOduration=6.612010679 podStartE2EDuration="6.612010679s" podCreationTimestamp="2026-04-16 15:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:05:04.608250663 +0000 UTC m=+768.949381521" watchObservedRunningTime="2026-04-16 15:05:04.612010679 +0000 UTC m=+768.953141512" Apr 16 15:05:05.599002 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:05.598638 2565 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 16 15:05:13.360463 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:13.360423 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:05:13.361299 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:13.361269 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:05:15.599075 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:15.599027 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 16 15:05:23.360598 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:23.360555 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 16 15:05:23.361066 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:23.360710 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:05:23.361457 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:23.361428 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:05:23.361591 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:23.361533 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:05:25.599331 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:25.599247 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 16 15:05:25.662924 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:25.662886 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" event={"ID":"01e75b68-a5d7-41a3-b185-020a24ed1293","Type":"ContainerStarted","Data":"42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89"} Apr 16 15:05:25.663246 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:25.663228 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:05:25.664392 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:25.664367 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 16 15:05:25.678161 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:25.678112 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podStartSLOduration=5.958902 podStartE2EDuration="27.678096261s" podCreationTimestamp="2026-04-16 15:04:58 +0000 UTC" firstStartedPulling="2026-04-16 15:05:03.586506598 +0000 UTC m=+767.927637435" lastFinishedPulling="2026-04-16 15:05:25.30570086 +0000 UTC m=+789.646831696" observedRunningTime="2026-04-16 15:05:25.676654704 +0000 UTC m=+790.017785560" watchObservedRunningTime="2026-04-16 15:05:25.678096261 +0000 UTC m=+790.019227117" Apr 16 15:05:26.666398 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:26.666362 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 16 15:05:29.369374 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.369351 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:05:29.489063 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.489025 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e64adc3-bb32-4063-9c38-b687be03f0da-kserve-provision-location\") pod \"4e64adc3-bb32-4063-9c38-b687be03f0da\" (UID: \"4e64adc3-bb32-4063-9c38-b687be03f0da\") " Apr 16 15:05:29.489429 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.489400 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e64adc3-bb32-4063-9c38-b687be03f0da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4e64adc3-bb32-4063-9c38-b687be03f0da" (UID: "4e64adc3-bb32-4063-9c38-b687be03f0da"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:05:29.590132 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.590051 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e64adc3-bb32-4063-9c38-b687be03f0da-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:05:29.676743 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.676706 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerID="5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b" exitCode=0 Apr 16 15:05:29.676905 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.676790 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" Apr 16 15:05:29.676905 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.676786 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" event={"ID":"4e64adc3-bb32-4063-9c38-b687be03f0da","Type":"ContainerDied","Data":"5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b"} Apr 16 15:05:29.676905 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.676893 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2" event={"ID":"4e64adc3-bb32-4063-9c38-b687be03f0da","Type":"ContainerDied","Data":"5bb5db4acd304d9a80b73ce1602bf073a92f89bd3e681c37cdd0c6307bf1cc3f"} Apr 16 15:05:29.677002 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.676909 2565 scope.go:117] "RemoveContainer" containerID="5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b" Apr 16 15:05:29.684725 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.684709 2565 scope.go:117] "RemoveContainer" containerID="f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4" Apr 16 15:05:29.691838 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.691816 2565 scope.go:117] "RemoveContainer" containerID="82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795" Apr 16 15:05:29.697460 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.697440 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2"] Apr 16 15:05:29.699107 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.699079 2565 scope.go:117] "RemoveContainer" containerID="5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b" Apr 16 15:05:29.699536 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:05:29.699434 2565 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b\": container with ID starting with 5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b not found: ID does not exist" containerID="5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b" Apr 16 15:05:29.699536 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.699469 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b"} err="failed to get container status \"5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b\": rpc error: code = NotFound desc = could not find container \"5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b\": container with ID starting with 5dadaae9a074a84af7c79c61800c4136d311de7711f4dd54e76e58d11fc7336b not found: ID does not exist" Apr 16 15:05:29.699536 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.699493 2565 scope.go:117] "RemoveContainer" containerID="f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4" Apr 16 15:05:29.699793 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:05:29.699772 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4\": container with ID starting with f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4 not found: ID does not exist" containerID="f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4" Apr 16 15:05:29.699861 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.699802 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4"} err="failed to get container status \"f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4\": rpc error: 
code = NotFound desc = could not find container \"f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4\": container with ID starting with f2b261e3393c9123b294ca44fb00b91bc10e1f4ab6d277f61cc7494c01e708b4 not found: ID does not exist" Apr 16 15:05:29.699861 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.699825 2565 scope.go:117] "RemoveContainer" containerID="82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795" Apr 16 15:05:29.700105 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:05:29.700089 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795\": container with ID starting with 82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795 not found: ID does not exist" containerID="82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795" Apr 16 15:05:29.700146 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.700110 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795"} err="failed to get container status \"82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795\": rpc error: code = NotFound desc = could not find container \"82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795\": container with ID starting with 82794ff9f3d18facb6b2d4f6cab6821dd223327c8f562f0adc9507f8478e1795 not found: ID does not exist" Apr 16 15:05:29.705478 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:29.701690 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-f99a6-predictor-698b5c9d6b-xn5s2"] Apr 16 15:05:30.326773 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:30.326731 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" 
path="/var/lib/kubelet/pods/4e64adc3-bb32-4063-9c38-b687be03f0da/volumes" Apr 16 15:05:35.599063 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:35.598960 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 16 15:05:36.666676 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:36.666629 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 16 15:05:45.599525 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:45.599473 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 16 15:05:46.667433 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:46.667392 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 16 15:05:55.599530 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:55.599482 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 
16 15:05:56.666957 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:05:56.666914 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 16 15:06:05.598995 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:05.598955 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 16 15:06:06.667262 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:06.667217 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 16 15:06:08.323183 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:08.323128 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused" Apr 16 15:06:16.666862 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:16.666813 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 16 15:06:18.326552 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:18.326523 
2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:06:26.667448 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:26.667416 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:06:28.793422 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.793384 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw"] Apr 16 15:06:28.793866 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.793836 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" Apr 16 15:06:28.793866 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.793854 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" Apr 16 15:06:28.793984 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.793881 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="storage-initializer" Apr 16 15:06:28.793984 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.793890 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="storage-initializer" Apr 16 15:06:28.793984 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.793900 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" Apr 16 15:06:28.793984 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.793928 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" Apr 16 15:06:28.794197 ip-10-0-142-46 kubenswrapper[2565]: I0416 
15:06:28.794017 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="agent" Apr 16 15:06:28.794197 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.794030 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e64adc3-bb32-4063-9c38-b687be03f0da" containerName="kserve-container" Apr 16 15:06:28.796824 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.796802 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:28.798889 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.798861 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-0fa3d-kube-rbac-proxy-sar-config\"" Apr 16 15:06:28.799008 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.798909 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:06:28.799008 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.798941 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-0fa3d-serving-cert\"" Apr 16 15:06:28.803352 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.803329 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw"] Apr 16 15:06:28.861511 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.861478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55501623-1e85-497c-8e95-44fd6fbfbe0c-openshift-service-ca-bundle\") pod \"model-chainer-raw-0fa3d-65695c4586-f64qw\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:28.861666 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:06:28.861536 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls\") pod \"model-chainer-raw-0fa3d-65695c4586-f64qw\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:28.962678 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.962637 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55501623-1e85-497c-8e95-44fd6fbfbe0c-openshift-service-ca-bundle\") pod \"model-chainer-raw-0fa3d-65695c4586-f64qw\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:28.962852 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.962695 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls\") pod \"model-chainer-raw-0fa3d-65695c4586-f64qw\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:28.962897 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:06:28.962867 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-0fa3d-serving-cert: secret "model-chainer-raw-0fa3d-serving-cert" not found Apr 16 15:06:28.962947 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:06:28.962936 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls podName:55501623-1e85-497c-8e95-44fd6fbfbe0c nodeName:}" failed. No retries permitted until 2026-04-16 15:06:29.462920588 +0000 UTC m=+853.804051426 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls") pod "model-chainer-raw-0fa3d-65695c4586-f64qw" (UID: "55501623-1e85-497c-8e95-44fd6fbfbe0c") : secret "model-chainer-raw-0fa3d-serving-cert" not found Apr 16 15:06:28.963374 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:28.963357 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55501623-1e85-497c-8e95-44fd6fbfbe0c-openshift-service-ca-bundle\") pod \"model-chainer-raw-0fa3d-65695c4586-f64qw\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:29.467188 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:29.467142 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls\") pod \"model-chainer-raw-0fa3d-65695c4586-f64qw\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:29.469543 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:29.469525 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls\") pod \"model-chainer-raw-0fa3d-65695c4586-f64qw\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:29.707484 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:29.707432 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:29.827447 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:29.827420 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw"] Apr 16 15:06:29.829997 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:06:29.829955 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55501623_1e85_497c_8e95_44fd6fbfbe0c.slice/crio-22f3b6c42a0e52e73f02e1046d4b522836bb939573138556b7f6e5fb4d55c8c9 WatchSource:0}: Error finding container 22f3b6c42a0e52e73f02e1046d4b522836bb939573138556b7f6e5fb4d55c8c9: Status 404 returned error can't find the container with id 22f3b6c42a0e52e73f02e1046d4b522836bb939573138556b7f6e5fb4d55c8c9 Apr 16 15:06:29.831874 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:29.831853 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:06:29.850628 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:29.850595 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" event={"ID":"55501623-1e85-497c-8e95-44fd6fbfbe0c","Type":"ContainerStarted","Data":"22f3b6c42a0e52e73f02e1046d4b522836bb939573138556b7f6e5fb4d55c8c9"} Apr 16 15:06:32.859799 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:32.859762 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" event={"ID":"55501623-1e85-497c-8e95-44fd6fbfbe0c","Type":"ContainerStarted","Data":"9bcbd01949ff55c6e63bcaebd51f5ec76d960f4f56f5d03645ab937935b48632"} Apr 16 15:06:32.860289 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:32.859875 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:32.873782 
ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:32.873728 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podStartSLOduration=1.9615607769999999 podStartE2EDuration="4.873706223s" podCreationTimestamp="2026-04-16 15:06:28 +0000 UTC" firstStartedPulling="2026-04-16 15:06:29.831976115 +0000 UTC m=+854.173106948" lastFinishedPulling="2026-04-16 15:06:32.744121561 +0000 UTC m=+857.085252394" observedRunningTime="2026-04-16 15:06:32.872914855 +0000 UTC m=+857.214045709" watchObservedRunningTime="2026-04-16 15:06:32.873706223 +0000 UTC m=+857.214837078" Apr 16 15:06:38.868429 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:38.868393 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw"] Apr 16 15:06:38.868953 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:38.868656 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" containerID="cri-o://9bcbd01949ff55c6e63bcaebd51f5ec76d960f4f56f5d03645ab937935b48632" gracePeriod=30 Apr 16 15:06:38.869391 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:38.869325 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:38.874100 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:38.874072 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:06:39.019993 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.019955 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78"] Apr 16 15:06:39.020357 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.020273 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" containerID="cri-o://a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d" gracePeriod=30 Apr 16 15:06:39.057606 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.057572 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz"] Apr 16 15:06:39.060893 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.060871 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:06:39.070083 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.070058 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz"] Apr 16 15:06:39.103298 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.103265 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k"] Apr 16 15:06:39.106464 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.106445 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:06:39.115263 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.115234 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k"] Apr 16 15:06:39.147383 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.147347 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e700bf1-641c-42b2-92f0-b5a30ba68cf8-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz\" (UID: \"7e700bf1-641c-42b2-92f0-b5a30ba68cf8\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:06:39.199780 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.199750 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k"] Apr 16 15:06:39.200019 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.199999 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" containerID="cri-o://42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89" gracePeriod=30 Apr 16 15:06:39.248497 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.248463 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db24c2d-d0c3-4ebf-8d45-b9f0103a573c-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k\" (UID: \"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:06:39.248665 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.248510 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e700bf1-641c-42b2-92f0-b5a30ba68cf8-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz\" (UID: \"7e700bf1-641c-42b2-92f0-b5a30ba68cf8\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:06:39.248908 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.248887 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e700bf1-641c-42b2-92f0-b5a30ba68cf8-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz\" (UID: \"7e700bf1-641c-42b2-92f0-b5a30ba68cf8\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:06:39.349462 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.349424 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db24c2d-d0c3-4ebf-8d45-b9f0103a573c-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k\" (UID: \"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:06:39.349843 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.349818 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db24c2d-d0c3-4ebf-8d45-b9f0103a573c-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k\" (UID: \"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:06:39.371371 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.371344 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:06:39.416309 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.416223 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:06:39.503558 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.503334 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz"] Apr 16 15:06:39.507483 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:06:39.507435 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e700bf1_641c_42b2_92f0_b5a30ba68cf8.slice/crio-f2e19ef945a555c0dfa14a997d6739e2e086f65574bf9de2dd81f098ace5cc66 WatchSource:0}: Error finding container f2e19ef945a555c0dfa14a997d6739e2e086f65574bf9de2dd81f098ace5cc66: Status 404 returned error can't find the container with id f2e19ef945a555c0dfa14a997d6739e2e086f65574bf9de2dd81f098ace5cc66 Apr 16 15:06:39.553531 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.553503 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k"] Apr 16 15:06:39.555952 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:06:39.555926 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db24c2d_d0c3_4ebf_8d45_b9f0103a573c.slice/crio-65592565bbdccb9e6c763f45a775b2bc4404916fcce99c8e3ce608c6fb031d4f WatchSource:0}: Error finding container 
65592565bbdccb9e6c763f45a775b2bc4404916fcce99c8e3ce608c6fb031d4f: Status 404 returned error can't find the container with id 65592565bbdccb9e6c763f45a775b2bc4404916fcce99c8e3ce608c6fb031d4f Apr 16 15:06:39.882457 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.882364 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" event={"ID":"7e700bf1-641c-42b2-92f0-b5a30ba68cf8","Type":"ContainerStarted","Data":"4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0"} Apr 16 15:06:39.882457 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.882403 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" event={"ID":"7e700bf1-641c-42b2-92f0-b5a30ba68cf8","Type":"ContainerStarted","Data":"f2e19ef945a555c0dfa14a997d6739e2e086f65574bf9de2dd81f098ace5cc66"} Apr 16 15:06:39.883864 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.883839 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" event={"ID":"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c","Type":"ContainerStarted","Data":"1e5f77e7ff34332d9950b91634f1765d046f8caec39b252622bed45efc16987a"} Apr 16 15:06:39.883864 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:39.883867 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" event={"ID":"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c","Type":"ContainerStarted","Data":"65592565bbdccb9e6c763f45a775b2bc4404916fcce99c8e3ce608c6fb031d4f"} Apr 16 15:06:43.036440 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.036415 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:06:43.186239 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.186208 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e75b68-a5d7-41a3-b185-020a24ed1293-kserve-provision-location\") pod \"01e75b68-a5d7-41a3-b185-020a24ed1293\" (UID: \"01e75b68-a5d7-41a3-b185-020a24ed1293\") " Apr 16 15:06:43.186586 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.186563 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e75b68-a5d7-41a3-b185-020a24ed1293-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "01e75b68-a5d7-41a3-b185-020a24ed1293" (UID: "01e75b68-a5d7-41a3-b185-020a24ed1293"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:06:43.287706 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.287673 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01e75b68-a5d7-41a3-b185-020a24ed1293-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:06:43.655391 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.655368 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:06:43.791039 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.790938 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/698398ad-669d-4a53-b8ce-c9e4d415cd4a-kserve-provision-location\") pod \"698398ad-669d-4a53-b8ce-c9e4d415cd4a\" (UID: \"698398ad-669d-4a53-b8ce-c9e4d415cd4a\") " Apr 16 15:06:43.791349 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.791322 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698398ad-669d-4a53-b8ce-c9e4d415cd4a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "698398ad-669d-4a53-b8ce-c9e4d415cd4a" (UID: "698398ad-669d-4a53-b8ce-c9e4d415cd4a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:06:43.873751 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.873708 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:06:43.892208 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.892162 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/698398ad-669d-4a53-b8ce-c9e4d415cd4a-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:06:43.896509 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.896481 2565 generic.go:358] "Generic (PLEG): container finished" podID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerID="42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89" exitCode=0 Apr 16 15:06:43.896655 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:06:43.896547 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" Apr 16 15:06:43.896655 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.896562 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" event={"ID":"01e75b68-a5d7-41a3-b185-020a24ed1293","Type":"ContainerDied","Data":"42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89"} Apr 16 15:06:43.896655 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.896609 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k" event={"ID":"01e75b68-a5d7-41a3-b185-020a24ed1293","Type":"ContainerDied","Data":"5b1a4675adbfce59286bb3de4528a296771f28edacdfbb58bba1bf6bec4cfcfd"} Apr 16 15:06:43.896655 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.896633 2565 scope.go:117] "RemoveContainer" containerID="42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89" Apr 16 15:06:43.897990 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.897967 2565 generic.go:358] "Generic (PLEG): container finished" podID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerID="a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d" exitCode=0 Apr 16 15:06:43.898094 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.898027 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" Apr 16 15:06:43.898094 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.898046 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" event={"ID":"698398ad-669d-4a53-b8ce-c9e4d415cd4a","Type":"ContainerDied","Data":"a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d"} Apr 16 15:06:43.898094 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.898091 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78" event={"ID":"698398ad-669d-4a53-b8ce-c9e4d415cd4a","Type":"ContainerDied","Data":"a7c6f007af9393d687ba4610a6142f099fc4d11d1bec00de37694b88dbf327a4"} Apr 16 15:06:43.899725 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.899694 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerID="4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0" exitCode=0 Apr 16 15:06:43.899826 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.899726 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" event={"ID":"7e700bf1-641c-42b2-92f0-b5a30ba68cf8","Type":"ContainerDied","Data":"4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0"} Apr 16 15:06:43.901316 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.901294 2565 generic.go:358] "Generic (PLEG): container finished" podID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerID="1e5f77e7ff34332d9950b91634f1765d046f8caec39b252622bed45efc16987a" exitCode=0 Apr 16 15:06:43.901424 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.901350 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" 
event={"ID":"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c","Type":"ContainerDied","Data":"1e5f77e7ff34332d9950b91634f1765d046f8caec39b252622bed45efc16987a"} Apr 16 15:06:43.906732 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.906709 2565 scope.go:117] "RemoveContainer" containerID="b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd" Apr 16 15:06:43.915209 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.915112 2565 scope.go:117] "RemoveContainer" containerID="42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89" Apr 16 15:06:43.915594 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:06:43.915496 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89\": container with ID starting with 42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89 not found: ID does not exist" containerID="42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89" Apr 16 15:06:43.915594 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.915535 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89"} err="failed to get container status \"42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89\": rpc error: code = NotFound desc = could not find container \"42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89\": container with ID starting with 42317f9e7eacb8dddb7ad64a60da6e385a0eeab1496c311a09f072ad21441b89 not found: ID does not exist" Apr 16 15:06:43.915594 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.915567 2565 scope.go:117] "RemoveContainer" containerID="b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd" Apr 16 15:06:43.916707 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:06:43.915897 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd\": container with ID starting with b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd not found: ID does not exist" containerID="b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd" Apr 16 15:06:43.916707 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.915929 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd"} err="failed to get container status \"b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd\": rpc error: code = NotFound desc = could not find container \"b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd\": container with ID starting with b120d09be596f892029fda287e406981fbc7b6671d45e29b43e9c2945618cbcd not found: ID does not exist" Apr 16 15:06:43.916707 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.915951 2565 scope.go:117] "RemoveContainer" containerID="a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d" Apr 16 15:06:43.930200 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.930156 2565 scope.go:117] "RemoveContainer" containerID="93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a" Apr 16 15:06:43.940952 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.940814 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78"] Apr 16 15:06:43.941018 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.940989 2565 scope.go:117] "RemoveContainer" containerID="a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d" Apr 16 15:06:43.941302 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:06:43.941283 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d\": container with ID starting with a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d not found: ID does not exist" containerID="a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d" Apr 16 15:06:43.941356 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.941310 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d"} err="failed to get container status \"a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d\": rpc error: code = NotFound desc = could not find container \"a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d\": container with ID starting with a9ec1b206c8edd67442bbbf7df8ebeff7fb76421cd589608562ae33912eb182d not found: ID does not exist" Apr 16 15:06:43.941356 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.941330 2565 scope.go:117] "RemoveContainer" containerID="93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a" Apr 16 15:06:43.941580 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:06:43.941561 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a\": container with ID starting with 93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a not found: ID does not exist" containerID="93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a" Apr 16 15:06:43.941633 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.941590 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a"} err="failed to get container status \"93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a\": rpc error: code = NotFound desc = could not find container 
\"93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a\": container with ID starting with 93c0f1d7481469399613795f4ea7aa03b9a1adb04d6d4e2873610552508ca75a not found: ID does not exist" Apr 16 15:06:43.948200 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.945450 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-0fa3d-predictor-5ccf48bd8d-jdt78"] Apr 16 15:06:43.958228 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.958196 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k"] Apr 16 15:06:43.961667 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:43.961644 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-0fa3d-predictor-74f94bbd95-z2d7k"] Apr 16 15:06:44.327102 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.327066 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" path="/var/lib/kubelet/pods/01e75b68-a5d7-41a3-b185-020a24ed1293/volumes" Apr 16 15:06:44.327484 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.327464 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" path="/var/lib/kubelet/pods/698398ad-669d-4a53-b8ce-c9e4d415cd4a/volumes" Apr 16 15:06:44.906764 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.906727 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" event={"ID":"7e700bf1-641c-42b2-92f0-b5a30ba68cf8","Type":"ContainerStarted","Data":"cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc"} Apr 16 15:06:44.907261 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.907227 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:06:44.908484 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.908447 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 15:06:44.908925 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.908897 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" event={"ID":"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c","Type":"ContainerStarted","Data":"331cded4fc1e38e7d233f71c0122ec171b4173dda76dd85367a725b85a84d99e"} Apr 16 15:06:44.909227 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.909205 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:06:44.910231 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.910206 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 15:06:44.923015 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.922965 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podStartSLOduration=5.922948627 podStartE2EDuration="5.922948627s" podCreationTimestamp="2026-04-16 15:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:06:44.921504035 +0000 
UTC m=+869.262634891" watchObservedRunningTime="2026-04-16 15:06:44.922948627 +0000 UTC m=+869.264079483" Apr 16 15:06:44.936860 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:44.936799 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podStartSLOduration=5.936781024 podStartE2EDuration="5.936781024s" podCreationTimestamp="2026-04-16 15:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:06:44.935416857 +0000 UTC m=+869.276547711" watchObservedRunningTime="2026-04-16 15:06:44.936781024 +0000 UTC m=+869.277911878" Apr 16 15:06:45.915383 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:45.915343 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 15:06:45.915787 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:45.915353 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 15:06:48.878105 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:48.878062 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:06:48.878501 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:48.878213 2565 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:06:53.873764 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:53.873665 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:06:55.916049 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:55.915997 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 15:06:55.916531 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:55.915997 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 15:06:58.873159 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:06:58.873120 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:03.873577 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:03.873535 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:05.915554 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:07:05.915470 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 15:07:05.915892 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:05.915471 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 15:07:08.874018 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:08.873981 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:08.985682 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:08.985644 2565 generic.go:358] "Generic (PLEG): container finished" podID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerID="9bcbd01949ff55c6e63bcaebd51f5ec76d960f4f56f5d03645ab937935b48632" exitCode=137 Apr 16 15:07:08.985851 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:08.985721 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" event={"ID":"55501623-1e85-497c-8e95-44fd6fbfbe0c","Type":"ContainerDied","Data":"9bcbd01949ff55c6e63bcaebd51f5ec76d960f4f56f5d03645ab937935b48632"} Apr 16 15:07:09.503568 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.503546 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:07:09.607685 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.607640 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls\") pod \"55501623-1e85-497c-8e95-44fd6fbfbe0c\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " Apr 16 15:07:09.607875 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.607727 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55501623-1e85-497c-8e95-44fd6fbfbe0c-openshift-service-ca-bundle\") pod \"55501623-1e85-497c-8e95-44fd6fbfbe0c\" (UID: \"55501623-1e85-497c-8e95-44fd6fbfbe0c\") " Apr 16 15:07:09.608116 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.608081 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55501623-1e85-497c-8e95-44fd6fbfbe0c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "55501623-1e85-497c-8e95-44fd6fbfbe0c" (UID: "55501623-1e85-497c-8e95-44fd6fbfbe0c"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:07:09.609834 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.609803 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "55501623-1e85-497c-8e95-44fd6fbfbe0c" (UID: "55501623-1e85-497c-8e95-44fd6fbfbe0c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:07:09.708440 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.708395 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55501623-1e85-497c-8e95-44fd6fbfbe0c-proxy-tls\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:07:09.708440 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.708424 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55501623-1e85-497c-8e95-44fd6fbfbe0c-openshift-service-ca-bundle\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:07:09.990194 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.990083 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" event={"ID":"55501623-1e85-497c-8e95-44fd6fbfbe0c","Type":"ContainerDied","Data":"22f3b6c42a0e52e73f02e1046d4b522836bb939573138556b7f6e5fb4d55c8c9"} Apr 16 15:07:09.990194 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.990118 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw" Apr 16 15:07:09.990194 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:09.990134 2565 scope.go:117] "RemoveContainer" containerID="9bcbd01949ff55c6e63bcaebd51f5ec76d960f4f56f5d03645ab937935b48632" Apr 16 15:07:10.010343 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:10.010301 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw"] Apr 16 15:07:10.011978 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:10.011956 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-0fa3d-65695c4586-f64qw"] Apr 16 15:07:10.326681 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:10.326588 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" path="/var/lib/kubelet/pods/55501623-1e85-497c-8e95-44fd6fbfbe0c/volumes" Apr 16 15:07:15.915677 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:15.915636 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 15:07:15.916068 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:15.915644 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 15:07:16.261842 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:16.261754 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 
15:07:16.262829 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:16.262806 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 15:07:25.915642 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:25.915593 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 15:07:25.916090 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:25.915593 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 15:07:35.916186 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:35.916133 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 15:07:35.916605 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:35.916133 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 15:07:45.915566 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:45.915517 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 15:07:45.916051 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:45.916031 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:07:55.917184 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:07:55.917136 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:08:09.093243 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093204 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6"] Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093675 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="storage-initializer" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093694 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="storage-initializer" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093712 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="storage-initializer" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093720 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="storage-initializer" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093741 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093749 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093765 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093774 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093785 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" Apr 16 15:08:09.093830 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093793 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" Apr 16 15:08:09.094318 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093863 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="01e75b68-a5d7-41a3-b185-020a24ed1293" containerName="kserve-container" Apr 16 15:08:09.094318 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093879 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="698398ad-669d-4a53-b8ce-c9e4d415cd4a" containerName="kserve-container" Apr 16 15:08:09.094318 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.093890 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="55501623-1e85-497c-8e95-44fd6fbfbe0c" containerName="model-chainer-raw-0fa3d" Apr 16 15:08:09.096201 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.096160 2565 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:09.098227 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.098205 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-007f8-serving-cert\"" Apr 16 15:08:09.098334 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.098208 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-007f8-kube-rbac-proxy-sar-config\"" Apr 16 15:08:09.098334 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.098320 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:08:09.104675 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.104646 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6"] Apr 16 15:08:09.184611 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.184574 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f0e4d64-6552-422e-b759-ef7870748314-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6\" (UID: \"8f0e4d64-6552-422e-b759-ef7870748314\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:09.184791 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.184632 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0e4d64-6552-422e-b759-ef7870748314-proxy-tls\") pod \"model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6\" (UID: \"8f0e4d64-6552-422e-b759-ef7870748314\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:09.285386 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:08:09.285350 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0e4d64-6552-422e-b759-ef7870748314-proxy-tls\") pod \"model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6\" (UID: \"8f0e4d64-6552-422e-b759-ef7870748314\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:09.285566 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.285523 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f0e4d64-6552-422e-b759-ef7870748314-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6\" (UID: \"8f0e4d64-6552-422e-b759-ef7870748314\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:09.286114 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.286090 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f0e4d64-6552-422e-b759-ef7870748314-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6\" (UID: \"8f0e4d64-6552-422e-b759-ef7870748314\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:09.287733 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.287706 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0e4d64-6552-422e-b759-ef7870748314-proxy-tls\") pod \"model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6\" (UID: \"8f0e4d64-6552-422e-b759-ef7870748314\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:09.406792 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.406756 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:09.527514 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:09.527369 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6"] Apr 16 15:08:09.530269 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:08:09.530241 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f0e4d64_6552_422e_b759_ef7870748314.slice/crio-8983836702f1b00e5ba9584b2a5e565002c808c862ac2991af84745b35d5ee44 WatchSource:0}: Error finding container 8983836702f1b00e5ba9584b2a5e565002c808c862ac2991af84745b35d5ee44: Status 404 returned error can't find the container with id 8983836702f1b00e5ba9584b2a5e565002c808c862ac2991af84745b35d5ee44 Apr 16 15:08:10.166748 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:10.166715 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" event={"ID":"8f0e4d64-6552-422e-b759-ef7870748314","Type":"ContainerStarted","Data":"9bf11496e1bc7479b75c3512d65ab0221086228d900fea852115edb6f4694c60"} Apr 16 15:08:10.166748 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:10.166750 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" event={"ID":"8f0e4d64-6552-422e-b759-ef7870748314","Type":"ContainerStarted","Data":"8983836702f1b00e5ba9584b2a5e565002c808c862ac2991af84745b35d5ee44"} Apr 16 15:08:10.167226 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:10.166837 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:10.182336 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:10.182290 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" podStartSLOduration=1.182275535 podStartE2EDuration="1.182275535s" podCreationTimestamp="2026-04-16 15:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:08:10.181712804 +0000 UTC m=+954.522843670" watchObservedRunningTime="2026-04-16 15:08:10.182275535 +0000 UTC m=+954.523406390" Apr 16 15:08:16.177696 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:16.177669 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:19.161448 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.161411 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6"] Apr 16 15:08:19.161857 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.161638 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" containerID="cri-o://9bf11496e1bc7479b75c3512d65ab0221086228d900fea852115edb6f4694c60" gracePeriod=30 Apr 16 15:08:19.312043 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.312001 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz"] Apr 16 15:08:19.312385 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.312342 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" containerID="cri-o://cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc" gracePeriod=30 Apr 16 15:08:19.329333 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:08:19.329306 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v"] Apr 16 15:08:19.331628 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.331613 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" Apr 16 15:08:19.340025 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.340002 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v"] Apr 16 15:08:19.341370 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.341350 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" Apr 16 15:08:19.404227 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.404112 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k"] Apr 16 15:08:19.404878 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.404476 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" containerID="cri-o://331cded4fc1e38e7d233f71c0122ec171b4173dda76dd85367a725b85a84d99e" gracePeriod=30 Apr 16 15:08:19.471072 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:19.471043 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v"] Apr 16 15:08:19.472878 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:08:19.472837 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddceed32f_55e0_4b72_bdda_061768e385ff.slice/crio-0ca95581cd2bf3b056d8e439cd5b86249a5d6abfe3297e4042dd8105d7f849d1 WatchSource:0}: Error finding container 0ca95581cd2bf3b056d8e439cd5b86249a5d6abfe3297e4042dd8105d7f849d1: Status 404 returned error can't find the container with id 0ca95581cd2bf3b056d8e439cd5b86249a5d6abfe3297e4042dd8105d7f849d1 Apr 16 15:08:20.205828 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:20.205785 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" event={"ID":"dceed32f-55e0-4b72-bdda-061768e385ff","Type":"ContainerStarted","Data":"0ca95581cd2bf3b056d8e439cd5b86249a5d6abfe3297e4042dd8105d7f849d1"} Apr 16 15:08:21.176375 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:21.176329 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:21.213323 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:21.213283 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" event={"ID":"dceed32f-55e0-4b72-bdda-061768e385ff","Type":"ContainerStarted","Data":"9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb"} Apr 16 15:08:21.214000 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:21.213578 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" Apr 16 15:08:21.215129 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:21.215108 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" Apr 16 15:08:21.227510 
ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:21.227463 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" podStartSLOduration=1.100063877 podStartE2EDuration="2.227448472s" podCreationTimestamp="2026-04-16 15:08:19 +0000 UTC" firstStartedPulling="2026-04-16 15:08:19.474939125 +0000 UTC m=+963.816069957" lastFinishedPulling="2026-04-16 15:08:20.602323702 +0000 UTC m=+964.943454552" observedRunningTime="2026-04-16 15:08:21.226781106 +0000 UTC m=+965.567911962" watchObservedRunningTime="2026-04-16 15:08:21.227448472 +0000 UTC m=+965.568579328" Apr 16 15:08:23.223686 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:23.223654 2565 generic.go:358] "Generic (PLEG): container finished" podID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerID="331cded4fc1e38e7d233f71c0122ec171b4173dda76dd85367a725b85a84d99e" exitCode=0 Apr 16 15:08:23.224065 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:23.223726 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" event={"ID":"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c","Type":"ContainerDied","Data":"331cded4fc1e38e7d233f71c0122ec171b4173dda76dd85367a725b85a84d99e"} Apr 16 15:08:23.248001 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:23.247980 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:08:23.298631 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:23.298534 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db24c2d-d0c3-4ebf-8d45-b9f0103a573c-kserve-provision-location\") pod \"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c\" (UID: \"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c\") " Apr 16 15:08:23.298890 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:23.298866 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db24c2d-d0c3-4ebf-8d45-b9f0103a573c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" (UID: "8db24c2d-d0c3-4ebf-8d45-b9f0103a573c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:08:23.399483 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:23.399438 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8db24c2d-d0c3-4ebf-8d45-b9f0103a573c-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:08:23.948853 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:23.948829 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:08:24.005354 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.005315 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e700bf1-641c-42b2-92f0-b5a30ba68cf8-kserve-provision-location\") pod \"7e700bf1-641c-42b2-92f0-b5a30ba68cf8\" (UID: \"7e700bf1-641c-42b2-92f0-b5a30ba68cf8\") " Apr 16 15:08:24.005656 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.005631 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e700bf1-641c-42b2-92f0-b5a30ba68cf8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e700bf1-641c-42b2-92f0-b5a30ba68cf8" (UID: "7e700bf1-641c-42b2-92f0-b5a30ba68cf8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:08:24.106849 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.106753 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e700bf1-641c-42b2-92f0-b5a30ba68cf8-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:08:24.228523 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.228490 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerID="cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc" exitCode=0 Apr 16 15:08:24.228988 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.228560 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" Apr 16 15:08:24.228988 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.228582 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" event={"ID":"7e700bf1-641c-42b2-92f0-b5a30ba68cf8","Type":"ContainerDied","Data":"cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc"} Apr 16 15:08:24.228988 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.228619 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz" event={"ID":"7e700bf1-641c-42b2-92f0-b5a30ba68cf8","Type":"ContainerDied","Data":"f2e19ef945a555c0dfa14a997d6739e2e086f65574bf9de2dd81f098ace5cc66"} Apr 16 15:08:24.228988 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.228640 2565 scope.go:117] "RemoveContainer" containerID="cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc" Apr 16 15:08:24.230062 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.230043 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" event={"ID":"8db24c2d-d0c3-4ebf-8d45-b9f0103a573c","Type":"ContainerDied","Data":"65592565bbdccb9e6c763f45a775b2bc4404916fcce99c8e3ce608c6fb031d4f"} Apr 16 15:08:24.230138 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.230105 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k" Apr 16 15:08:24.236330 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.236302 2565 scope.go:117] "RemoveContainer" containerID="4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0" Apr 16 15:08:24.244143 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.244126 2565 scope.go:117] "RemoveContainer" containerID="cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc" Apr 16 15:08:24.244439 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:08:24.244419 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc\": container with ID starting with cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc not found: ID does not exist" containerID="cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc" Apr 16 15:08:24.244506 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.244448 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc"} err="failed to get container status \"cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc\": rpc error: code = NotFound desc = could not find container \"cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc\": container with ID starting with cda11af558f3319df248e49a375cd3fb6ea04871a2e7899a3275611301d0cafc not found: ID does not exist" Apr 16 15:08:24.244506 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.244470 2565 scope.go:117] "RemoveContainer" containerID="4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0" Apr 16 15:08:24.244692 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:08:24.244675 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0\": container with ID starting with 4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0 not found: ID does not exist" containerID="4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0" Apr 16 15:08:24.244746 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.244696 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0"} err="failed to get container status \"4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0\": rpc error: code = NotFound desc = could not find container \"4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0\": container with ID starting with 4b29c49c8917fd7341d98fdd04b6a3c7dce9d31f9c2cc843b9a3fb372aaf98c0 not found: ID does not exist" Apr 16 15:08:24.244746 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.244710 2565 scope.go:117] "RemoveContainer" containerID="331cded4fc1e38e7d233f71c0122ec171b4173dda76dd85367a725b85a84d99e" Apr 16 15:08:24.252085 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.252060 2565 scope.go:117] "RemoveContainer" containerID="1e5f77e7ff34332d9950b91634f1765d046f8caec39b252622bed45efc16987a" Apr 16 15:08:24.253751 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.253731 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz"] Apr 16 15:08:24.256195 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.256161 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-007f8-predictor-578bc4d7bd-7hkkz"] Apr 16 15:08:24.265354 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.265330 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k"] Apr 16 15:08:24.267094 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:08:24.267072 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-007f8-predictor-775bddbb67-lhs6k"] Apr 16 15:08:24.326833 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.326800 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" path="/var/lib/kubelet/pods/7e700bf1-641c-42b2-92f0-b5a30ba68cf8/volumes" Apr 16 15:08:24.327257 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:24.327238 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" path="/var/lib/kubelet/pods/8db24c2d-d0c3-4ebf-8d45-b9f0103a573c/volumes" Apr 16 15:08:26.176323 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:26.176277 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:29.361869 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.361820 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv"] Apr 16 15:08:29.362356 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362341 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" Apr 16 15:08:29.362419 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362359 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" Apr 16 15:08:29.362419 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362383 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="storage-initializer" Apr 16 
15:08:29.362419 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362392 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="storage-initializer" Apr 16 15:08:29.362419 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362402 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="storage-initializer" Apr 16 15:08:29.362419 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362411 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="storage-initializer" Apr 16 15:08:29.362658 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362434 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" Apr 16 15:08:29.362658 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362445 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" Apr 16 15:08:29.362658 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362521 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e700bf1-641c-42b2-92f0-b5a30ba68cf8" containerName="kserve-container" Apr 16 15:08:29.362658 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.362532 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="8db24c2d-d0c3-4ebf-8d45-b9f0103a573c" containerName="kserve-container" Apr 16 15:08:29.365924 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.365903 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:08:29.372870 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.372845 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv"] Apr 16 15:08:29.449798 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.449751 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d2d2964-af79-4836-958f-bed6d6dd4a47-kserve-provision-location\") pod \"isvc-logger-raw-116da-predictor-679df64688-9nfhv\" (UID: \"5d2d2964-af79-4836-958f-bed6d6dd4a47\") " pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:08:29.551165 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.551124 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d2d2964-af79-4836-958f-bed6d6dd4a47-kserve-provision-location\") pod \"isvc-logger-raw-116da-predictor-679df64688-9nfhv\" (UID: \"5d2d2964-af79-4836-958f-bed6d6dd4a47\") " pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:08:29.551528 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.551508 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d2d2964-af79-4836-958f-bed6d6dd4a47-kserve-provision-location\") pod \"isvc-logger-raw-116da-predictor-679df64688-9nfhv\" (UID: \"5d2d2964-af79-4836-958f-bed6d6dd4a47\") " pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:08:29.676381 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.676350 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:08:29.793513 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:29.793487 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv"] Apr 16 15:08:29.795713 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:08:29.795685 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d2d2964_af79_4836_958f_bed6d6dd4a47.slice/crio-df5d94f25acb78aa04f6c2efc1a8cdd8272eb6b4c0f7875bdeba0149289f751a WatchSource:0}: Error finding container df5d94f25acb78aa04f6c2efc1a8cdd8272eb6b4c0f7875bdeba0149289f751a: Status 404 returned error can't find the container with id df5d94f25acb78aa04f6c2efc1a8cdd8272eb6b4c0f7875bdeba0149289f751a Apr 16 15:08:30.252108 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:30.252067 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" event={"ID":"5d2d2964-af79-4836-958f-bed6d6dd4a47","Type":"ContainerStarted","Data":"8dd23bc1d466f0cec0aaabbf74342e4a713391a4f117e3584e8810b5c2d9e27e"} Apr 16 15:08:30.252108 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:30.252110 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" event={"ID":"5d2d2964-af79-4836-958f-bed6d6dd4a47","Type":"ContainerStarted","Data":"df5d94f25acb78aa04f6c2efc1a8cdd8272eb6b4c0f7875bdeba0149289f751a"} Apr 16 15:08:31.175935 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:31.175893 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:31.176334 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:08:31.176010 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:34.266185 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:34.266133 2565 generic.go:358] "Generic (PLEG): container finished" podID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerID="8dd23bc1d466f0cec0aaabbf74342e4a713391a4f117e3584e8810b5c2d9e27e" exitCode=0 Apr 16 15:08:34.266573 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:34.266193 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" event={"ID":"5d2d2964-af79-4836-958f-bed6d6dd4a47","Type":"ContainerDied","Data":"8dd23bc1d466f0cec0aaabbf74342e4a713391a4f117e3584e8810b5c2d9e27e"} Apr 16 15:08:35.270949 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:35.270911 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" event={"ID":"5d2d2964-af79-4836-958f-bed6d6dd4a47","Type":"ContainerStarted","Data":"261c790e1db7d7fb554c7a02ec95a058648d2e7198b0bbb41e88ebcf062ca2d6"} Apr 16 15:08:35.270949 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:35.270955 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" event={"ID":"5d2d2964-af79-4836-958f-bed6d6dd4a47","Type":"ContainerStarted","Data":"f2e04e9e1bdc47ef8c5722760840af0c5444c2d29bd50075436c9e27d0403608"} Apr 16 15:08:35.271495 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:35.271320 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:08:35.272568 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:35.272544 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" 
podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:08:35.286301 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:35.286256 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podStartSLOduration=6.2862423849999995 podStartE2EDuration="6.286242385s" podCreationTimestamp="2026-04-16 15:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:08:35.284934186 +0000 UTC m=+979.626065065" watchObservedRunningTime="2026-04-16 15:08:35.286242385 +0000 UTC m=+979.627373240" Apr 16 15:08:36.176703 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:36.176667 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:36.276632 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:36.276602 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:08:36.277090 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:36.276723 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:08:36.277650 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:36.277624 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" 
podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:37.279066 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:37.279025 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:08:37.279520 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:37.279384 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:41.176258 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:41.176222 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:46.176064 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:46.176023 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:47.279476 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:47.279422 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 
15:08:47.279955 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:47.279930 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:49.320356 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:49.320319 2565 generic.go:358] "Generic (PLEG): container finished" podID="8f0e4d64-6552-422e-b759-ef7870748314" containerID="9bf11496e1bc7479b75c3512d65ab0221086228d900fea852115edb6f4694c60" exitCode=0 Apr 16 15:08:49.320738 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:49.320395 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" event={"ID":"8f0e4d64-6552-422e-b759-ef7870748314","Type":"ContainerDied","Data":"9bf11496e1bc7479b75c3512d65ab0221086228d900fea852115edb6f4694c60"} Apr 16 15:08:49.796371 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:49.796348 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:49.916662 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:49.916620 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0e4d64-6552-422e-b759-ef7870748314-proxy-tls\") pod \"8f0e4d64-6552-422e-b759-ef7870748314\" (UID: \"8f0e4d64-6552-422e-b759-ef7870748314\") " Apr 16 15:08:49.916831 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:49.916685 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f0e4d64-6552-422e-b759-ef7870748314-openshift-service-ca-bundle\") pod \"8f0e4d64-6552-422e-b759-ef7870748314\" (UID: \"8f0e4d64-6552-422e-b759-ef7870748314\") " Apr 16 15:08:49.917056 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:49.917018 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0e4d64-6552-422e-b759-ef7870748314-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8f0e4d64-6552-422e-b759-ef7870748314" (UID: "8f0e4d64-6552-422e-b759-ef7870748314"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:08:49.918735 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:49.918712 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0e4d64-6552-422e-b759-ef7870748314-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f0e4d64-6552-422e-b759-ef7870748314" (UID: "8f0e4d64-6552-422e-b759-ef7870748314"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:08:50.017246 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:50.017124 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f0e4d64-6552-422e-b759-ef7870748314-openshift-service-ca-bundle\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:08:50.017246 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:50.017158 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f0e4d64-6552-422e-b759-ef7870748314-proxy-tls\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:08:50.324834 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:50.324750 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" Apr 16 15:08:50.326686 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:50.326661 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6" event={"ID":"8f0e4d64-6552-422e-b759-ef7870748314","Type":"ContainerDied","Data":"8983836702f1b00e5ba9584b2a5e565002c808c862ac2991af84745b35d5ee44"} Apr 16 15:08:50.326771 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:50.326699 2565 scope.go:117] "RemoveContainer" containerID="9bf11496e1bc7479b75c3512d65ab0221086228d900fea852115edb6f4694c60" Apr 16 15:08:50.347892 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:50.347864 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6"] Apr 16 15:08:50.351514 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:50.351490 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-007f8-774dcbc4f8-4hnw6"] Apr 16 15:08:52.326904 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:52.326871 2565 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="8f0e4d64-6552-422e-b759-ef7870748314" path="/var/lib/kubelet/pods/8f0e4d64-6552-422e-b759-ef7870748314/volumes" Apr 16 15:08:57.279827 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:57.279779 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:08:57.280365 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:08:57.280313 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:07.279779 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:07.279723 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:09:07.280287 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:07.280217 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:17.279704 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:17.279642 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 
15:09:17.280269 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:17.280232 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:27.279763 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:27.279711 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:09:27.280288 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:27.280100 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:37.279952 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:37.279897 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:09:37.280473 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:37.280442 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:47.280412 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:47.280383 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:09:47.280810 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:47.280440 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:09:54.414298 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.414268 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-116da-predictor-6578d49f9-hhb9v_dceed32f-55e0-4b72-bdda-061768e385ff/kserve-container/0.log" Apr 16 15:09:54.570531 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.570499 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv"] Apr 16 15:09:54.570805 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.570780 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" containerID="cri-o://f2e04e9e1bdc47ef8c5722760840af0c5444c2d29bd50075436c9e27d0403608" gracePeriod=30 Apr 16 15:09:54.570886 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.570823 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" containerID="cri-o://261c790e1db7d7fb554c7a02ec95a058648d2e7198b0bbb41e88ebcf062ca2d6" gracePeriod=30 Apr 16 15:09:54.606048 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.606011 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25"] Apr 16 15:09:54.606408 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.606390 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" Apr 16 15:09:54.606495 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.606409 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" Apr 16 15:09:54.606556 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.606504 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f0e4d64-6552-422e-b759-ef7870748314" containerName="model-chainer-raw-hpa-007f8" Apr 16 15:09:54.608852 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.608833 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:09:54.616089 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.616068 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25"] Apr 16 15:09:54.666820 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.666736 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v"] Apr 16 15:09:54.667057 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.667025 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" podUID="dceed32f-55e0-4b72-bdda-061768e385ff" containerName="kserve-container" containerID="cri-o://9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb" gracePeriod=30 Apr 16 15:09:54.727093 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.727056 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dc64f04-3d14-419a-8df4-dd29eb39b88d-kserve-provision-location\") pod 
\"isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25\" (UID: \"0dc64f04-3d14-419a-8df4-dd29eb39b88d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:09:54.827671 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.827635 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dc64f04-3d14-419a-8df4-dd29eb39b88d-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25\" (UID: \"0dc64f04-3d14-419a-8df4-dd29eb39b88d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:09:54.828011 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.827991 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dc64f04-3d14-419a-8df4-dd29eb39b88d-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25\" (UID: \"0dc64f04-3d14-419a-8df4-dd29eb39b88d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:09:54.907897 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.907874 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" Apr 16 15:09:54.918599 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:54.918525 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:09:55.038802 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.038769 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25"] Apr 16 15:09:55.041846 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:09:55.041815 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc64f04_3d14_419a_8df4_dd29eb39b88d.slice/crio-7d283e8a5a866889bd920b0fa74eae65f0e5a40f579b2c5e233f0ba80199414a WatchSource:0}: Error finding container 7d283e8a5a866889bd920b0fa74eae65f0e5a40f579b2c5e233f0ba80199414a: Status 404 returned error can't find the container with id 7d283e8a5a866889bd920b0fa74eae65f0e5a40f579b2c5e233f0ba80199414a Apr 16 15:09:55.516159 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.516119 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" event={"ID":"0dc64f04-3d14-419a-8df4-dd29eb39b88d","Type":"ContainerStarted","Data":"ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7"} Apr 16 15:09:55.516159 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.516158 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" event={"ID":"0dc64f04-3d14-419a-8df4-dd29eb39b88d","Type":"ContainerStarted","Data":"7d283e8a5a866889bd920b0fa74eae65f0e5a40f579b2c5e233f0ba80199414a"} Apr 16 15:09:55.517415 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.517382 2565 generic.go:358] "Generic (PLEG): container finished" podID="dceed32f-55e0-4b72-bdda-061768e385ff" containerID="9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb" exitCode=2 Apr 16 15:09:55.517561 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.517437 2565 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" Apr 16 15:09:55.517561 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.517460 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" event={"ID":"dceed32f-55e0-4b72-bdda-061768e385ff","Type":"ContainerDied","Data":"9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb"} Apr 16 15:09:55.517561 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.517496 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v" event={"ID":"dceed32f-55e0-4b72-bdda-061768e385ff","Type":"ContainerDied","Data":"0ca95581cd2bf3b056d8e439cd5b86249a5d6abfe3297e4042dd8105d7f849d1"} Apr 16 15:09:55.517561 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.517510 2565 scope.go:117] "RemoveContainer" containerID="9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb" Apr 16 15:09:55.532990 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.532972 2565 scope.go:117] "RemoveContainer" containerID="9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb" Apr 16 15:09:55.533279 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:09:55.533253 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb\": container with ID starting with 9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb not found: ID does not exist" containerID="9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb" Apr 16 15:09:55.533389 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.533285 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb"} err="failed to 
get container status \"9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb\": rpc error: code = NotFound desc = could not find container \"9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb\": container with ID starting with 9653863dcb858032daf6cbb46ec36d2e6560bd8301a692b3c1d5c8211cd469eb not found: ID does not exist" Apr 16 15:09:55.544595 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.544567 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v"] Apr 16 15:09:55.546486 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:55.546459 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-116da-predictor-6578d49f9-hhb9v"] Apr 16 15:09:56.327214 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:56.327158 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dceed32f-55e0-4b72-bdda-061768e385ff" path="/var/lib/kubelet/pods/dceed32f-55e0-4b72-bdda-061768e385ff/volumes" Apr 16 15:09:57.279193 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:57.279127 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:09:57.279605 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:57.279496 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:59.534235 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:59.534204 2565 generic.go:358] "Generic (PLEG): container finished" podID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" 
containerID="ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7" exitCode=0 Apr 16 15:09:59.534628 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:09:59.534278 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" event={"ID":"0dc64f04-3d14-419a-8df4-dd29eb39b88d","Type":"ContainerDied","Data":"ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7"} Apr 16 15:10:00.538751 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:00.538716 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" event={"ID":"0dc64f04-3d14-419a-8df4-dd29eb39b88d","Type":"ContainerStarted","Data":"0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d"} Apr 16 15:10:00.539147 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:00.539017 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:10:00.540395 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:00.540369 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:10:00.553423 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:00.553384 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podStartSLOduration=6.553370354 podStartE2EDuration="6.553370354s" podCreationTimestamp="2026-04-16 15:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:10:00.552260231 +0000 UTC m=+1064.893391086" 
watchObservedRunningTime="2026-04-16 15:10:00.553370354 +0000 UTC m=+1064.894501210" Apr 16 15:10:01.542007 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:01.541969 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:10:02.546561 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:02.546526 2565 generic.go:358] "Generic (PLEG): container finished" podID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerID="f2e04e9e1bdc47ef8c5722760840af0c5444c2d29bd50075436c9e27d0403608" exitCode=0 Apr 16 15:10:02.546959 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:02.546593 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" event={"ID":"5d2d2964-af79-4836-958f-bed6d6dd4a47","Type":"ContainerDied","Data":"f2e04e9e1bdc47ef8c5722760840af0c5444c2d29bd50075436c9e27d0403608"} Apr 16 15:10:07.279490 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:07.279380 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:10:07.279861 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:07.279698 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:11.542472 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:11.542433 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:10:17.279393 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:17.279338 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 15:10:17.279855 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:17.279504 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:10:17.279855 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:17.279710 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:17.279855 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:17.279813 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:10:21.542335 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:21.542295 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:10:24.610843 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:24.610800 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerID="261c790e1db7d7fb554c7a02ec95a058648d2e7198b0bbb41e88ebcf062ca2d6" exitCode=0 Apr 16 15:10:24.611220 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:24.610866 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" event={"ID":"5d2d2964-af79-4836-958f-bed6d6dd4a47","Type":"ContainerDied","Data":"261c790e1db7d7fb554c7a02ec95a058648d2e7198b0bbb41e88ebcf062ca2d6"} Apr 16 15:10:25.211380 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.211358 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:10:25.386872 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.386783 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d2d2964-af79-4836-958f-bed6d6dd4a47-kserve-provision-location\") pod \"5d2d2964-af79-4836-958f-bed6d6dd4a47\" (UID: \"5d2d2964-af79-4836-958f-bed6d6dd4a47\") " Apr 16 15:10:25.387131 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.387110 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d2d2964-af79-4836-958f-bed6d6dd4a47-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5d2d2964-af79-4836-958f-bed6d6dd4a47" (UID: "5d2d2964-af79-4836-958f-bed6d6dd4a47"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:25.488278 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.488245 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d2d2964-af79-4836-958f-bed6d6dd4a47-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:10:25.615091 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.615054 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" event={"ID":"5d2d2964-af79-4836-958f-bed6d6dd4a47","Type":"ContainerDied","Data":"df5d94f25acb78aa04f6c2efc1a8cdd8272eb6b4c0f7875bdeba0149289f751a"} Apr 16 15:10:25.615091 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.615081 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv" Apr 16 15:10:25.615586 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.615111 2565 scope.go:117] "RemoveContainer" containerID="261c790e1db7d7fb554c7a02ec95a058648d2e7198b0bbb41e88ebcf062ca2d6" Apr 16 15:10:25.623458 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.623346 2565 scope.go:117] "RemoveContainer" containerID="f2e04e9e1bdc47ef8c5722760840af0c5444c2d29bd50075436c9e27d0403608" Apr 16 15:10:25.630388 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.630369 2565 scope.go:117] "RemoveContainer" containerID="8dd23bc1d466f0cec0aaabbf74342e4a713391a4f117e3584e8810b5c2d9e27e" Apr 16 15:10:25.636637 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.636610 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv"] Apr 16 15:10:25.640851 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:25.640826 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-116da-predictor-679df64688-9nfhv"] 
Apr 16 15:10:26.326413 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:26.326373 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" path="/var/lib/kubelet/pods/5d2d2964-af79-4836-958f-bed6d6dd4a47/volumes" Apr 16 15:10:31.542740 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:31.542688 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:10:41.542411 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:41.542371 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:10:51.542452 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:10:51.542405 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:11:01.542906 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:11:01.542855 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:11:11.542802 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:11:11.542757 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:11:12.326581 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:11:12.326542 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:11:22.327605 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:11:22.327562 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:11:32.327024 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:11:32.326980 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:11:42.326837 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:11:42.326742 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:11:52.328237 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:11:52.328048 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:12:02.326961 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:02.326911 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:12:12.327008 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:12.326965 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:12:16.282537 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:16.282508 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 15:12:16.283806 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:16.283784 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 15:12:22.326475 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:22.326444 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:12:24.801159 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.801123 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25"] Apr 16 15:12:24.801590 
ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.801451 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" containerID="cri-o://0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d" gracePeriod=30 Apr 16 15:12:24.885626 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.885595 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp"] Apr 16 15:12:24.885968 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.885950 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dceed32f-55e0-4b72-bdda-061768e385ff" containerName="kserve-container" Apr 16 15:12:24.886051 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.885971 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dceed32f-55e0-4b72-bdda-061768e385ff" containerName="kserve-container" Apr 16 15:12:24.886051 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.885996 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" Apr 16 15:12:24.886051 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.886004 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" Apr 16 15:12:24.886051 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.886023 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="storage-initializer" Apr 16 15:12:24.886051 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.886032 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="storage-initializer" Apr 16 15:12:24.886326 ip-10-0-142-46 kubenswrapper[2565]: I0416 
15:12:24.886053 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" Apr 16 15:12:24.886326 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.886061 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" Apr 16 15:12:24.886326 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.886129 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="kserve-container" Apr 16 15:12:24.886326 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.886141 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d2d2964-af79-4836-958f-bed6d6dd4a47" containerName="agent" Apr 16 15:12:24.886326 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.886154 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="dceed32f-55e0-4b72-bdda-061768e385ff" containerName="kserve-container" Apr 16 15:12:24.889360 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.889338 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:12:24.895980 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.895940 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp"] Apr 16 15:12:24.939043 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:24.939004 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd02460-5936-48c3-b169-fd8a14fe81d7-kserve-provision-location\") pod \"isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp\" (UID: \"ecd02460-5936-48c3-b169-fd8a14fe81d7\") " pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:12:25.039942 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:25.039900 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd02460-5936-48c3-b169-fd8a14fe81d7-kserve-provision-location\") pod \"isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp\" (UID: \"ecd02460-5936-48c3-b169-fd8a14fe81d7\") " pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:12:25.040344 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:25.040321 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd02460-5936-48c3-b169-fd8a14fe81d7-kserve-provision-location\") pod \"isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp\" (UID: \"ecd02460-5936-48c3-b169-fd8a14fe81d7\") " pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:12:25.201126 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:25.201096 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:12:25.320148 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:25.320115 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp"] Apr 16 15:12:25.323123 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:12:25.323093 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd02460_5936_48c3_b169_fd8a14fe81d7.slice/crio-8a30f02fbe9ae9d48f8e42fb540e217c9e1fd4c2acdbf48cbfaf564b47743fd4 WatchSource:0}: Error finding container 8a30f02fbe9ae9d48f8e42fb540e217c9e1fd4c2acdbf48cbfaf564b47743fd4: Status 404 returned error can't find the container with id 8a30f02fbe9ae9d48f8e42fb540e217c9e1fd4c2acdbf48cbfaf564b47743fd4 Apr 16 15:12:25.324944 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:25.324926 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:12:25.961266 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:25.961228 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" event={"ID":"ecd02460-5936-48c3-b169-fd8a14fe81d7","Type":"ContainerStarted","Data":"91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee"} Apr 16 15:12:25.961266 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:25.961267 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" event={"ID":"ecd02460-5936-48c3-b169-fd8a14fe81d7","Type":"ContainerStarted","Data":"8a30f02fbe9ae9d48f8e42fb540e217c9e1fd4c2acdbf48cbfaf564b47743fd4"} Apr 16 15:12:29.974100 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:29.974067 2565 generic.go:358] "Generic (PLEG): container finished" podID="ecd02460-5936-48c3-b169-fd8a14fe81d7" 
containerID="91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee" exitCode=0 Apr 16 15:12:29.974498 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:29.974127 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" event={"ID":"ecd02460-5936-48c3-b169-fd8a14fe81d7","Type":"ContainerDied","Data":"91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee"} Apr 16 15:12:30.978708 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:30.978670 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" event={"ID":"ecd02460-5936-48c3-b169-fd8a14fe81d7","Type":"ContainerStarted","Data":"59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750"} Apr 16 15:12:30.979095 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:30.978953 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:12:30.980387 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:30.980361 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:12:30.996163 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:30.996118 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podStartSLOduration=6.996100898 podStartE2EDuration="6.996100898s" podCreationTimestamp="2026-04-16 15:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:12:30.994244724 +0000 UTC m=+1215.335375579" watchObservedRunningTime="2026-04-16 15:12:30.996100898 
+0000 UTC m=+1215.337231756" Apr 16 15:12:31.982223 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:31.982152 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:12:32.323468 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:32.323372 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 15:12:34.244145 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.244123 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:12:34.320102 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.320018 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dc64f04-3d14-419a-8df4-dd29eb39b88d-kserve-provision-location\") pod \"0dc64f04-3d14-419a-8df4-dd29eb39b88d\" (UID: \"0dc64f04-3d14-419a-8df4-dd29eb39b88d\") " Apr 16 15:12:34.320377 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.320350 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc64f04-3d14-419a-8df4-dd29eb39b88d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0dc64f04-3d14-419a-8df4-dd29eb39b88d" (UID: "0dc64f04-3d14-419a-8df4-dd29eb39b88d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:12:34.421548 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.421508 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dc64f04-3d14-419a-8df4-dd29eb39b88d-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:12:34.990120 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.990080 2565 generic.go:358] "Generic (PLEG): container finished" podID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerID="0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d" exitCode=0 Apr 16 15:12:34.990306 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.990134 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" event={"ID":"0dc64f04-3d14-419a-8df4-dd29eb39b88d","Type":"ContainerDied","Data":"0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d"} Apr 16 15:12:34.990306 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.990161 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" Apr 16 15:12:34.990306 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.990166 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25" event={"ID":"0dc64f04-3d14-419a-8df4-dd29eb39b88d","Type":"ContainerDied","Data":"7d283e8a5a866889bd920b0fa74eae65f0e5a40f579b2c5e233f0ba80199414a"} Apr 16 15:12:34.990306 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.990214 2565 scope.go:117] "RemoveContainer" containerID="0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d" Apr 16 15:12:34.998034 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:34.998010 2565 scope.go:117] "RemoveContainer" containerID="ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7" Apr 16 15:12:35.006126 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:35.006099 2565 scope.go:117] "RemoveContainer" containerID="0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d" Apr 16 15:12:35.006480 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:12:35.006455 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d\": container with ID starting with 0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d not found: ID does not exist" containerID="0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d" Apr 16 15:12:35.006562 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:35.006491 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d"} err="failed to get container status \"0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d\": rpc error: code = NotFound desc = could not find container 
\"0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d\": container with ID starting with 0e9d3962e9410e6a0e3f03dd75138bdac22002bca3333e7e55b797e439ef1e0d not found: ID does not exist" Apr 16 15:12:35.006562 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:35.006519 2565 scope.go:117] "RemoveContainer" containerID="ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7" Apr 16 15:12:35.006775 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:12:35.006755 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7\": container with ID starting with ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7 not found: ID does not exist" containerID="ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7" Apr 16 15:12:35.006842 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:35.006783 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7"} err="failed to get container status \"ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7\": rpc error: code = NotFound desc = could not find container \"ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7\": container with ID starting with ebc27f903916b749edb1e5a8c3aad635c2f27f0ef1846528d2f5a04855e807a7 not found: ID does not exist" Apr 16 15:12:35.006984 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:35.006964 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25"] Apr 16 15:12:35.010593 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:35.010563 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ac40b-predictor-56648db454-vwl25"] Apr 16 15:12:36.331050 ip-10-0-142-46 kubenswrapper[2565]: I0416 
15:12:36.331015 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" path="/var/lib/kubelet/pods/0dc64f04-3d14-419a-8df4-dd29eb39b88d/volumes" Apr 16 15:12:41.983070 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:41.983025 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:12:51.982514 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:12:51.982465 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:13:01.982948 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:01.982904 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:13:11.982840 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:11.982740 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:13:21.982433 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:21.982379 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:13:31.982832 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:31.982783 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:13:39.324143 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:39.324104 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:13:44.999287 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:44.999250 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv"] Apr 16 15:13:44.999668 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:44.999572 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="storage-initializer" Apr 16 15:13:44.999668 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:44.999584 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="storage-initializer" Apr 16 15:13:44.999668 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:44.999592 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" Apr 16 15:13:44.999668 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:44.999598 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" containerName="kserve-container" Apr 16 15:13:44.999668 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:44.999650 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dc64f04-3d14-419a-8df4-dd29eb39b88d" 
containerName="kserve-container" Apr 16 15:13:45.002559 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.002543 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:13:45.004756 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.004729 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-f9fec2\"" Apr 16 15:13:45.004881 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.004855 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 15:13:45.005089 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.005076 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-f9fec2-dockercfg-k4mpw\"" Apr 16 15:13:45.009938 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.009892 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv"] Apr 16 15:13:45.189454 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.189420 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71dbcd77-e56b-47d2-9890-a4892849c4b5-kserve-provision-location\") pod \"isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv\" (UID: \"71dbcd77-e56b-47d2-9890-a4892849c4b5\") " pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:13:45.189639 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.189475 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71dbcd77-e56b-47d2-9890-a4892849c4b5-cabundle-cert\") pod \"isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv\" (UID: 
\"71dbcd77-e56b-47d2-9890-a4892849c4b5\") " pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:13:45.290804 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.290720 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71dbcd77-e56b-47d2-9890-a4892849c4b5-kserve-provision-location\") pod \"isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv\" (UID: \"71dbcd77-e56b-47d2-9890-a4892849c4b5\") " pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:13:45.290804 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.290794 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71dbcd77-e56b-47d2-9890-a4892849c4b5-cabundle-cert\") pod \"isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv\" (UID: \"71dbcd77-e56b-47d2-9890-a4892849c4b5\") " pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:13:45.291125 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.291104 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71dbcd77-e56b-47d2-9890-a4892849c4b5-kserve-provision-location\") pod \"isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv\" (UID: \"71dbcd77-e56b-47d2-9890-a4892849c4b5\") " pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:13:45.291389 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.291372 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71dbcd77-e56b-47d2-9890-a4892849c4b5-cabundle-cert\") pod \"isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv\" (UID: \"71dbcd77-e56b-47d2-9890-a4892849c4b5\") " pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:13:45.313967 
ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.313938 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:13:45.436584 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:45.436557 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv"] Apr 16 15:13:45.439264 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:13:45.439234 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71dbcd77_e56b_47d2_9890_a4892849c4b5.slice/crio-a3ad66823717eb65553a0674375f4785072781d57d0129413d68ffd23401a844 WatchSource:0}: Error finding container a3ad66823717eb65553a0674375f4785072781d57d0129413d68ffd23401a844: Status 404 returned error can't find the container with id a3ad66823717eb65553a0674375f4785072781d57d0129413d68ffd23401a844 Apr 16 15:13:46.185877 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:46.185843 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" event={"ID":"71dbcd77-e56b-47d2-9890-a4892849c4b5","Type":"ContainerStarted","Data":"9f208d74c41b9c69ca7ad5582052e4e7ed1703f4ecdb5d0923418ca05c758a3e"} Apr 16 15:13:46.185877 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:46.185878 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" event={"ID":"71dbcd77-e56b-47d2-9890-a4892849c4b5","Type":"ContainerStarted","Data":"a3ad66823717eb65553a0674375f4785072781d57d0129413d68ffd23401a844"} Apr 16 15:13:49.196247 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:49.196224 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_71dbcd77-e56b-47d2-9890-a4892849c4b5/storage-initializer/0.log" Apr 16 15:13:49.196592 
ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:49.196261 2565 generic.go:358] "Generic (PLEG): container finished" podID="71dbcd77-e56b-47d2-9890-a4892849c4b5" containerID="9f208d74c41b9c69ca7ad5582052e4e7ed1703f4ecdb5d0923418ca05c758a3e" exitCode=1 Apr 16 15:13:49.196592 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:49.196334 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" event={"ID":"71dbcd77-e56b-47d2-9890-a4892849c4b5","Type":"ContainerDied","Data":"9f208d74c41b9c69ca7ad5582052e4e7ed1703f4ecdb5d0923418ca05c758a3e"} Apr 16 15:13:50.200994 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:50.200961 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_71dbcd77-e56b-47d2-9890-a4892849c4b5/storage-initializer/0.log" Apr 16 15:13:50.201405 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:50.201011 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" event={"ID":"71dbcd77-e56b-47d2-9890-a4892849c4b5","Type":"ContainerStarted","Data":"302b150934ac233d0a44b695c239a65c209e8bb593b4a44611f9ce9da16fbf83"} Apr 16 15:13:53.211362 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:53.211331 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_71dbcd77-e56b-47d2-9890-a4892849c4b5/storage-initializer/1.log" Apr 16 15:13:53.211795 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:53.211739 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_71dbcd77-e56b-47d2-9890-a4892849c4b5/storage-initializer/0.log" Apr 16 15:13:53.211795 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:53.211773 2565 generic.go:358] "Generic (PLEG): container finished" podID="71dbcd77-e56b-47d2-9890-a4892849c4b5" 
containerID="302b150934ac233d0a44b695c239a65c209e8bb593b4a44611f9ce9da16fbf83" exitCode=1 Apr 16 15:13:53.211880 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:53.211855 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" event={"ID":"71dbcd77-e56b-47d2-9890-a4892849c4b5","Type":"ContainerDied","Data":"302b150934ac233d0a44b695c239a65c209e8bb593b4a44611f9ce9da16fbf83"} Apr 16 15:13:53.211914 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:53.211904 2565 scope.go:117] "RemoveContainer" containerID="9f208d74c41b9c69ca7ad5582052e4e7ed1703f4ecdb5d0923418ca05c758a3e" Apr 16 15:13:53.212315 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:53.212298 2565 scope.go:117] "RemoveContainer" containerID="9f208d74c41b9c69ca7ad5582052e4e7ed1703f4ecdb5d0923418ca05c758a3e" Apr 16 15:13:53.222086 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:13:53.222050 2565 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_kserve-ci-e2e-test_71dbcd77-e56b-47d2-9890-a4892849c4b5_0 in pod sandbox a3ad66823717eb65553a0674375f4785072781d57d0129413d68ffd23401a844 from index: no such id: '9f208d74c41b9c69ca7ad5582052e4e7ed1703f4ecdb5d0923418ca05c758a3e'" containerID="9f208d74c41b9c69ca7ad5582052e4e7ed1703f4ecdb5d0923418ca05c758a3e" Apr 16 15:13:53.222149 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:13:53.222118 2565 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_kserve-ci-e2e-test_71dbcd77-e56b-47d2-9890-a4892849c4b5_0 in pod sandbox a3ad66823717eb65553a0674375f4785072781d57d0129413d68ffd23401a844 from index: no such id: '9f208d74c41b9c69ca7ad5582052e4e7ed1703f4ecdb5d0923418ca05c758a3e'; 
Skipping pod \"isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_kserve-ci-e2e-test(71dbcd77-e56b-47d2-9890-a4892849c4b5)\"" logger="UnhandledError" Apr 16 15:13:53.223467 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:13:53.223444 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_kserve-ci-e2e-test(71dbcd77-e56b-47d2-9890-a4892849c4b5)\"" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" podUID="71dbcd77-e56b-47d2-9890-a4892849c4b5" Apr 16 15:13:54.215479 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:13:54.215449 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_71dbcd77-e56b-47d2-9890-a4892849c4b5/storage-initializer/1.log" Apr 16 15:14:01.098250 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.098207 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv"] Apr 16 15:14:01.141768 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.141730 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp"] Apr 16 15:14:01.142312 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.142258 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" containerID="cri-o://59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750" gracePeriod=30 Apr 16 15:14:01.208319 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.208285 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh"] Apr 16 
15:14:01.212818 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.212793 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:01.214935 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.214912 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-8c321a\"" Apr 16 15:14:01.215049 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.214972 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-8c321a-dockercfg-t7pc9\"" Apr 16 15:14:01.220636 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.220450 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh"] Apr 16 15:14:01.233949 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.233930 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_71dbcd77-e56b-47d2-9890-a4892849c4b5/storage-initializer/1.log" Apr 16 15:14:01.234058 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.233988 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:14:01.238000 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.237983 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv_71dbcd77-e56b-47d2-9890-a4892849c4b5/storage-initializer/1.log" Apr 16 15:14:01.238132 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.238030 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" event={"ID":"71dbcd77-e56b-47d2-9890-a4892849c4b5","Type":"ContainerDied","Data":"a3ad66823717eb65553a0674375f4785072781d57d0129413d68ffd23401a844"} Apr 16 15:14:01.238132 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.238063 2565 scope.go:117] "RemoveContainer" containerID="302b150934ac233d0a44b695c239a65c209e8bb593b4a44611f9ce9da16fbf83" Apr 16 15:14:01.238132 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.238087 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv" Apr 16 15:14:01.308756 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.308710 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71dbcd77-e56b-47d2-9890-a4892849c4b5-kserve-provision-location\") pod \"71dbcd77-e56b-47d2-9890-a4892849c4b5\" (UID: \"71dbcd77-e56b-47d2-9890-a4892849c4b5\") " Apr 16 15:14:01.308952 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.308790 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71dbcd77-e56b-47d2-9890-a4892849c4b5-cabundle-cert\") pod \"71dbcd77-e56b-47d2-9890-a4892849c4b5\" (UID: \"71dbcd77-e56b-47d2-9890-a4892849c4b5\") " Apr 16 15:14:01.308952 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.308902 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-cabundle-cert\") pod \"isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh\" (UID: \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\") " pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:01.309064 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.308964 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-kserve-provision-location\") pod \"isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh\" (UID: \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\") " pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:01.309064 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.309024 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/71dbcd77-e56b-47d2-9890-a4892849c4b5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "71dbcd77-e56b-47d2-9890-a4892849c4b5" (UID: "71dbcd77-e56b-47d2-9890-a4892849c4b5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:01.309149 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.309129 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71dbcd77-e56b-47d2-9890-a4892849c4b5-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "71dbcd77-e56b-47d2-9890-a4892849c4b5" (UID: "71dbcd77-e56b-47d2-9890-a4892849c4b5"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:14:01.409825 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.409777 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-cabundle-cert\") pod \"isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh\" (UID: \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\") " pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:01.410012 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.409857 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-kserve-provision-location\") pod \"isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh\" (UID: \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\") " pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:01.410012 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.409891 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71dbcd77-e56b-47d2-9890-a4892849c4b5-kserve-provision-location\") 
on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:14:01.410012 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.409903 2565 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/71dbcd77-e56b-47d2-9890-a4892849c4b5-cabundle-cert\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:14:01.410281 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.410264 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-kserve-provision-location\") pod \"isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh\" (UID: \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\") " pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:01.410543 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.410519 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-cabundle-cert\") pod \"isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh\" (UID: \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\") " pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:01.533077 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.533037 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:01.579673 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.579636 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv"] Apr 16 15:14:01.583884 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.583856 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-f9fec2-predictor-56fc78594d-hj4lv"] Apr 16 15:14:01.653482 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:01.653453 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh"] Apr 16 15:14:01.656641 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:14:01.656613 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab9a5f0_db9c_4c1b_ba7d_4ab756ac0d5b.slice/crio-bb426e90ed6959aafd60a762e8d1383b059010a52f900183a5394f50f2f3fada WatchSource:0}: Error finding container bb426e90ed6959aafd60a762e8d1383b059010a52f900183a5394f50f2f3fada: Status 404 returned error can't find the container with id bb426e90ed6959aafd60a762e8d1383b059010a52f900183a5394f50f2f3fada Apr 16 15:14:02.243924 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:02.243881 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" event={"ID":"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b","Type":"ContainerStarted","Data":"9628bcc7cd3a590879cff692bdddfe7cff80e25e910ccece423efa33d56e743b"} Apr 16 15:14:02.243924 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:02.243925 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" 
event={"ID":"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b","Type":"ContainerStarted","Data":"bb426e90ed6959aafd60a762e8d1383b059010a52f900183a5394f50f2f3fada"} Apr 16 15:14:02.327290 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:02.327252 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71dbcd77-e56b-47d2-9890-a4892849c4b5" path="/var/lib/kubelet/pods/71dbcd77-e56b-47d2-9890-a4892849c4b5/volumes" Apr 16 15:14:05.585295 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:05.585265 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:14:05.644862 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:05.644831 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd02460-5936-48c3-b169-fd8a14fe81d7-kserve-provision-location\") pod \"ecd02460-5936-48c3-b169-fd8a14fe81d7\" (UID: \"ecd02460-5936-48c3-b169-fd8a14fe81d7\") " Apr 16 15:14:05.645199 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:05.645152 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd02460-5936-48c3-b169-fd8a14fe81d7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ecd02460-5936-48c3-b169-fd8a14fe81d7" (UID: "ecd02460-5936-48c3-b169-fd8a14fe81d7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:05.746073 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:05.745986 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd02460-5936-48c3-b169-fd8a14fe81d7-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:14:06.256373 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.256341 2565 generic.go:358] "Generic (PLEG): container finished" podID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerID="59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750" exitCode=0 Apr 16 15:14:06.256586 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.256407 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" event={"ID":"ecd02460-5936-48c3-b169-fd8a14fe81d7","Type":"ContainerDied","Data":"59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750"} Apr 16 15:14:06.256586 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.256408 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" Apr 16 15:14:06.256586 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.256433 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp" event={"ID":"ecd02460-5936-48c3-b169-fd8a14fe81d7","Type":"ContainerDied","Data":"8a30f02fbe9ae9d48f8e42fb540e217c9e1fd4c2acdbf48cbfaf564b47743fd4"} Apr 16 15:14:06.256586 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.256450 2565 scope.go:117] "RemoveContainer" containerID="59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750" Apr 16 15:14:06.257950 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.257928 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b/storage-initializer/0.log" Apr 16 15:14:06.258063 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.257966 2565 generic.go:358] "Generic (PLEG): container finished" podID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" containerID="9628bcc7cd3a590879cff692bdddfe7cff80e25e910ccece423efa33d56e743b" exitCode=1 Apr 16 15:14:06.258063 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.258039 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" event={"ID":"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b","Type":"ContainerDied","Data":"9628bcc7cd3a590879cff692bdddfe7cff80e25e910ccece423efa33d56e743b"} Apr 16 15:14:06.265709 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.265685 2565 scope.go:117] "RemoveContainer" containerID="91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee" Apr 16 15:14:06.272690 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.272669 2565 scope.go:117] "RemoveContainer" containerID="59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750" Apr 16 15:14:06.273019 ip-10-0-142-46 
kubenswrapper[2565]: E0416 15:14:06.272999 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750\": container with ID starting with 59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750 not found: ID does not exist" containerID="59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750" Apr 16 15:14:06.273067 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.273030 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750"} err="failed to get container status \"59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750\": rpc error: code = NotFound desc = could not find container \"59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750\": container with ID starting with 59f352de53b7c6bb3ff56714c78f5396ce5718a8885021bd53a68b5cbf5d9750 not found: ID does not exist" Apr 16 15:14:06.273067 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.273051 2565 scope.go:117] "RemoveContainer" containerID="91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee" Apr 16 15:14:06.273319 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:14:06.273297 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee\": container with ID starting with 91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee not found: ID does not exist" containerID="91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee" Apr 16 15:14:06.273411 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.273321 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee"} err="failed to 
get container status \"91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee\": rpc error: code = NotFound desc = could not find container \"91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee\": container with ID starting with 91656522ad345eb08eabf3be81d4e2ae735f9806e51b5d4e515e8cc8a83c0aee not found: ID does not exist" Apr 16 15:14:06.287977 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.287948 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp"] Apr 16 15:14:06.291368 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.291331 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-f9fec2-predictor-784f9bbf45-cgzlp"] Apr 16 15:14:06.326587 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:06.326558 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" path="/var/lib/kubelet/pods/ecd02460-5936-48c3-b169-fd8a14fe81d7/volumes" Apr 16 15:14:07.263273 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:07.263245 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b/storage-initializer/0.log" Apr 16 15:14:07.263678 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:07.263306 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" event={"ID":"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b","Type":"ContainerStarted","Data":"fa32207436e4d3014ed348b0e78dd2edb468416171859e6485d5b67d2e3a8544"} Apr 16 15:14:09.270749 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:09.270721 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b/storage-initializer/1.log" Apr 16 15:14:09.271156 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:14:09.271110 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b/storage-initializer/0.log" Apr 16 15:14:09.271156 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:09.271143 2565 generic.go:358] "Generic (PLEG): container finished" podID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" containerID="fa32207436e4d3014ed348b0e78dd2edb468416171859e6485d5b67d2e3a8544" exitCode=1 Apr 16 15:14:09.271286 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:09.271226 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" event={"ID":"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b","Type":"ContainerDied","Data":"fa32207436e4d3014ed348b0e78dd2edb468416171859e6485d5b67d2e3a8544"} Apr 16 15:14:09.271341 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:09.271284 2565 scope.go:117] "RemoveContainer" containerID="9628bcc7cd3a590879cff692bdddfe7cff80e25e910ccece423efa33d56e743b" Apr 16 15:14:09.271668 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:09.271647 2565 scope.go:117] "RemoveContainer" containerID="9628bcc7cd3a590879cff692bdddfe7cff80e25e910ccece423efa33d56e743b" Apr 16 15:14:09.281523 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:14:09.281497 2565 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_kserve-ci-e2e-test_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b_0 in pod sandbox bb426e90ed6959aafd60a762e8d1383b059010a52f900183a5394f50f2f3fada from index: no such id: '9628bcc7cd3a590879cff692bdddfe7cff80e25e910ccece423efa33d56e743b'" containerID="9628bcc7cd3a590879cff692bdddfe7cff80e25e910ccece423efa33d56e743b" Apr 16 15:14:09.281606 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:14:09.281545 2565 kuberuntime_container.go:951] "Unhandled Error" 
err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_kserve-ci-e2e-test_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b_0 in pod sandbox bb426e90ed6959aafd60a762e8d1383b059010a52f900183a5394f50f2f3fada from index: no such id: '9628bcc7cd3a590879cff692bdddfe7cff80e25e910ccece423efa33d56e743b'; Skipping pod \"isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_kserve-ci-e2e-test(8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b)\"" logger="UnhandledError" Apr 16 15:14:09.282900 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:14:09.282880 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_kserve-ci-e2e-test(8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b)\"" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" podUID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" Apr 16 15:14:10.276287 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:10.276254 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b/storage-initializer/1.log" Apr 16 15:14:11.221758 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.221726 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh"] Apr 16 15:14:11.348277 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.348255 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b/storage-initializer/1.log" Apr 16 15:14:11.348624 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.348316 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:11.363575 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363548 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p"] Apr 16 15:14:11.363837 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363824 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" containerName="storage-initializer" Apr 16 15:14:11.363877 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363839 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" containerName="storage-initializer" Apr 16 15:14:11.363877 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363851 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" containerName="storage-initializer" Apr 16 15:14:11.363877 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363856 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" containerName="storage-initializer" Apr 16 15:14:11.363877 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363872 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="storage-initializer" Apr 16 15:14:11.363877 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363878 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="storage-initializer" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363885 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363890 2565 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363896 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71dbcd77-e56b-47d2-9890-a4892849c4b5" containerName="storage-initializer" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363901 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="71dbcd77-e56b-47d2-9890-a4892849c4b5" containerName="storage-initializer" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363910 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71dbcd77-e56b-47d2-9890-a4892849c4b5" containerName="storage-initializer" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363915 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="71dbcd77-e56b-47d2-9890-a4892849c4b5" containerName="storage-initializer" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363961 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecd02460-5936-48c3-b169-fd8a14fe81d7" containerName="kserve-container" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363969 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" containerName="storage-initializer" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363976 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="71dbcd77-e56b-47d2-9890-a4892849c4b5" containerName="storage-initializer" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.363982 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="71dbcd77-e56b-47d2-9890-a4892849c4b5" containerName="storage-initializer" Apr 16 15:14:11.364030 ip-10-0-142-46 kubenswrapper[2565]: I0416 
15:14:11.363989 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" containerName="storage-initializer" Apr 16 15:14:11.368332 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.368304 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:14:11.370353 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.370330 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-l2z86\"" Apr 16 15:14:11.377903 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.377873 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p"] Apr 16 15:14:11.392886 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.392844 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-kserve-provision-location\") pod \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\" (UID: \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\") " Apr 16 15:14:11.393059 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.392901 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-cabundle-cert\") pod \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\" (UID: \"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b\") " Apr 16 15:14:11.393159 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.393139 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053841b9-fccf-41c9-a494-f9030a7432d7-kserve-provision-location\") pod \"raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p\" (UID: \"053841b9-fccf-41c9-a494-f9030a7432d7\") " 
pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:14:11.393222 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.393149 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" (UID: "8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:11.393312 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.393290 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" (UID: "8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:14:11.494687 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.494596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053841b9-fccf-41c9-a494-f9030a7432d7-kserve-provision-location\") pod \"raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p\" (UID: \"053841b9-fccf-41c9-a494-f9030a7432d7\") " pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:14:11.494687 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.494665 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:14:11.494687 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.494681 2565 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b-cabundle-cert\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:14:11.494987 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.494966 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053841b9-fccf-41c9-a494-f9030a7432d7-kserve-provision-location\") pod \"raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p\" (UID: \"053841b9-fccf-41c9-a494-f9030a7432d7\") " pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:14:11.679203 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.679150 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:14:11.802782 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:11.802756 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p"] Apr 16 15:14:11.805612 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:14:11.805583 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053841b9_fccf_41c9_a494_f9030a7432d7.slice/crio-4371bff17a8e6ce2a58dc80de77fe93b843c81a6cf3a95af98fcce233130f22d WatchSource:0}: Error finding container 4371bff17a8e6ce2a58dc80de77fe93b843c81a6cf3a95af98fcce233130f22d: Status 404 returned error can't find the container with id 4371bff17a8e6ce2a58dc80de77fe93b843c81a6cf3a95af98fcce233130f22d Apr 16 15:14:12.283601 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:12.283570 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh_8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b/storage-initializer/1.log" Apr 16 15:14:12.283791 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:12.283700 2565 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" Apr 16 15:14:12.283791 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:12.283700 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh" event={"ID":"8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b","Type":"ContainerDied","Data":"bb426e90ed6959aafd60a762e8d1383b059010a52f900183a5394f50f2f3fada"} Apr 16 15:14:12.283912 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:12.283829 2565 scope.go:117] "RemoveContainer" containerID="fa32207436e4d3014ed348b0e78dd2edb468416171859e6485d5b67d2e3a8544" Apr 16 15:14:12.285352 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:12.285325 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" event={"ID":"053841b9-fccf-41c9-a494-f9030a7432d7","Type":"ContainerStarted","Data":"7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd"} Apr 16 15:14:12.285428 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:12.285358 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" event={"ID":"053841b9-fccf-41c9-a494-f9030a7432d7","Type":"ContainerStarted","Data":"4371bff17a8e6ce2a58dc80de77fe93b843c81a6cf3a95af98fcce233130f22d"} Apr 16 15:14:12.321443 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:12.321412 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh"] Apr 16 15:14:12.327856 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:12.327826 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8c321a-predictor-6cd87d9f66-z7rlh"] Apr 16 15:14:14.327243 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:14.327203 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b" 
path="/var/lib/kubelet/pods/8ab9a5f0-db9c-4c1b-ba7d-4ab756ac0d5b/volumes" Apr 16 15:14:16.301545 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:16.301507 2565 generic.go:358] "Generic (PLEG): container finished" podID="053841b9-fccf-41c9-a494-f9030a7432d7" containerID="7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd" exitCode=0 Apr 16 15:14:16.301942 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:16.301580 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" event={"ID":"053841b9-fccf-41c9-a494-f9030a7432d7","Type":"ContainerDied","Data":"7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd"} Apr 16 15:14:17.305962 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:17.305901 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" event={"ID":"053841b9-fccf-41c9-a494-f9030a7432d7","Type":"ContainerStarted","Data":"693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9"} Apr 16 15:14:17.306427 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:17.306287 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:14:17.307701 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:17.307671 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 15:14:17.325221 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:17.325151 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podStartSLOduration=6.32513507 podStartE2EDuration="6.32513507s" podCreationTimestamp="2026-04-16 15:14:11 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:14:17.323269647 +0000 UTC m=+1321.664400503" watchObservedRunningTime="2026-04-16 15:14:17.32513507 +0000 UTC m=+1321.666265925" Apr 16 15:14:18.309241 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:18.309203 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 15:14:28.309849 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:28.309802 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 15:14:38.309402 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:38.309297 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 15:14:48.309314 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:48.309261 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 15:14:58.310083 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:14:58.310024 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" 
podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 15:15:08.309955 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:08.309904 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 15:15:18.309392 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:18.309346 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 15:15:23.323686 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:23.323656 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:15:31.495945 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.495910 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p"] Apr 16 15:15:31.496370 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.496191 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" containerID="cri-o://693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9" gracePeriod=30 Apr 16 15:15:31.552915 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.552878 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt"] Apr 16 15:15:31.556040 ip-10-0-142-46 
kubenswrapper[2565]: I0416 15:15:31.556026 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:15:31.563428 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.563400 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt"] Apr 16 15:15:31.621875 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.621839 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41446b76-2e24-47db-a963-c8287e69fbbd-kserve-provision-location\") pod \"raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt\" (UID: \"41446b76-2e24-47db-a963-c8287e69fbbd\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:15:31.723161 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.723122 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41446b76-2e24-47db-a963-c8287e69fbbd-kserve-provision-location\") pod \"raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt\" (UID: \"41446b76-2e24-47db-a963-c8287e69fbbd\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:15:31.723504 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.723485 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41446b76-2e24-47db-a963-c8287e69fbbd-kserve-provision-location\") pod \"raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt\" (UID: \"41446b76-2e24-47db-a963-c8287e69fbbd\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:15:31.866594 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.866502 2565 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:15:31.985819 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:31.985651 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt"] Apr 16 15:15:31.988719 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:15:31.988688 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41446b76_2e24_47db_a963_c8287e69fbbd.slice/crio-033ec905183cb93a5c8e38bfa8ddda8b7a60bfd1275a165c6bdfbd3f74bdb456 WatchSource:0}: Error finding container 033ec905183cb93a5c8e38bfa8ddda8b7a60bfd1275a165c6bdfbd3f74bdb456: Status 404 returned error can't find the container with id 033ec905183cb93a5c8e38bfa8ddda8b7a60bfd1275a165c6bdfbd3f74bdb456 Apr 16 15:15:32.517038 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:32.517002 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" event={"ID":"41446b76-2e24-47db-a963-c8287e69fbbd","Type":"ContainerStarted","Data":"a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9"} Apr 16 15:15:32.517038 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:32.517042 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" event={"ID":"41446b76-2e24-47db-a963-c8287e69fbbd","Type":"ContainerStarted","Data":"033ec905183cb93a5c8e38bfa8ddda8b7a60bfd1275a165c6bdfbd3f74bdb456"} Apr 16 15:15:33.324056 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:33.324013 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 
16 15:15:36.037333 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.037264 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:15:36.164050 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.164011 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053841b9-fccf-41c9-a494-f9030a7432d7-kserve-provision-location\") pod \"053841b9-fccf-41c9-a494-f9030a7432d7\" (UID: \"053841b9-fccf-41c9-a494-f9030a7432d7\") " Apr 16 15:15:36.164395 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.164367 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/053841b9-fccf-41c9-a494-f9030a7432d7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "053841b9-fccf-41c9-a494-f9030a7432d7" (UID: "053841b9-fccf-41c9-a494-f9030a7432d7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:15:36.264889 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.264852 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/053841b9-fccf-41c9-a494-f9030a7432d7-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:15:36.529826 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.529735 2565 generic.go:358] "Generic (PLEG): container finished" podID="41446b76-2e24-47db-a963-c8287e69fbbd" containerID="a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9" exitCode=0 Apr 16 15:15:36.529826 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.529810 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" event={"ID":"41446b76-2e24-47db-a963-c8287e69fbbd","Type":"ContainerDied","Data":"a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9"} Apr 16 15:15:36.531214 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.531189 2565 generic.go:358] "Generic (PLEG): container finished" podID="053841b9-fccf-41c9-a494-f9030a7432d7" containerID="693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9" exitCode=0 Apr 16 15:15:36.531322 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.531272 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" Apr 16 15:15:36.531442 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.531270 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" event={"ID":"053841b9-fccf-41c9-a494-f9030a7432d7","Type":"ContainerDied","Data":"693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9"} Apr 16 15:15:36.531563 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.531461 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p" event={"ID":"053841b9-fccf-41c9-a494-f9030a7432d7","Type":"ContainerDied","Data":"4371bff17a8e6ce2a58dc80de77fe93b843c81a6cf3a95af98fcce233130f22d"} Apr 16 15:15:36.531563 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.531487 2565 scope.go:117] "RemoveContainer" containerID="693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9" Apr 16 15:15:36.539222 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.539204 2565 scope.go:117] "RemoveContainer" containerID="7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd" Apr 16 15:15:36.546563 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.546538 2565 scope.go:117] "RemoveContainer" containerID="693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9" Apr 16 15:15:36.547201 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:15:36.547159 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9\": container with ID starting with 693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9 not found: ID does not exist" containerID="693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9" Apr 16 15:15:36.547305 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.547209 2565 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9"} err="failed to get container status \"693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9\": rpc error: code = NotFound desc = could not find container \"693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9\": container with ID starting with 693ff6c059f46c34e4dff385960aa25b740a9d6a68af09f0ac84bb673ac8b9d9 not found: ID does not exist" Apr 16 15:15:36.547305 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.547228 2565 scope.go:117] "RemoveContainer" containerID="7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd" Apr 16 15:15:36.547495 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:15:36.547479 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd\": container with ID starting with 7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd not found: ID does not exist" containerID="7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd" Apr 16 15:15:36.547557 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.547498 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd"} err="failed to get container status \"7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd\": rpc error: code = NotFound desc = could not find container \"7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd\": container with ID starting with 7981b82e9215c4f87151c795c37dfd6dbd3e91becabfdfa7f22c1ea674e393cd not found: ID does not exist" Apr 16 15:15:36.557674 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.557629 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p"] Apr 
16 15:15:36.559434 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:36.559414 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-7eb70-predictor-8458bdd6cb-ntq7p"] Apr 16 15:15:37.535598 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:37.535563 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" event={"ID":"41446b76-2e24-47db-a963-c8287e69fbbd","Type":"ContainerStarted","Data":"914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8"} Apr 16 15:15:37.536047 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:37.535834 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:15:37.537262 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:37.537230 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 15:15:37.550866 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:37.550818 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podStartSLOduration=6.550804607 podStartE2EDuration="6.550804607s" podCreationTimestamp="2026-04-16 15:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:15:37.549349788 +0000 UTC m=+1401.890480643" watchObservedRunningTime="2026-04-16 15:15:37.550804607 +0000 UTC m=+1401.891935461" Apr 16 15:15:38.327016 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:38.326982 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="053841b9-fccf-41c9-a494-f9030a7432d7" path="/var/lib/kubelet/pods/053841b9-fccf-41c9-a494-f9030a7432d7/volumes" Apr 16 15:15:38.539953 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:38.539907 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 15:15:48.540308 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:48.540266 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 15:15:58.540235 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:15:58.540146 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 15:16:08.540221 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:08.540105 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 15:16:18.540385 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:18.540333 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: 
connect: connection refused" Apr 16 15:16:28.540287 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:28.540239 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 15:16:38.540443 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:38.540396 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 15:16:39.323251 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:39.323213 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 15:16:49.324404 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:49.324361 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:16:51.666038 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:51.666003 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt"] Apr 16 15:16:51.666496 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:51.666254 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" 
containerID="cri-o://914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8" gracePeriod=30 Apr 16 15:16:56.305299 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.305275 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:16:56.391057 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.390964 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41446b76-2e24-47db-a963-c8287e69fbbd-kserve-provision-location\") pod \"41446b76-2e24-47db-a963-c8287e69fbbd\" (UID: \"41446b76-2e24-47db-a963-c8287e69fbbd\") " Apr 16 15:16:56.391345 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.391317 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41446b76-2e24-47db-a963-c8287e69fbbd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41446b76-2e24-47db-a963-c8287e69fbbd" (UID: "41446b76-2e24-47db-a963-c8287e69fbbd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:16:56.492218 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.492146 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41446b76-2e24-47db-a963-c8287e69fbbd-kserve-provision-location\") on node \"ip-10-0-142-46.ec2.internal\" DevicePath \"\"" Apr 16 15:16:56.765675 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.765641 2565 generic.go:358] "Generic (PLEG): container finished" podID="41446b76-2e24-47db-a963-c8287e69fbbd" containerID="914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8" exitCode=0 Apr 16 15:16:56.765856 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.765715 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" Apr 16 15:16:56.765856 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.765723 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" event={"ID":"41446b76-2e24-47db-a963-c8287e69fbbd","Type":"ContainerDied","Data":"914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8"} Apr 16 15:16:56.765856 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.765759 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt" event={"ID":"41446b76-2e24-47db-a963-c8287e69fbbd","Type":"ContainerDied","Data":"033ec905183cb93a5c8e38bfa8ddda8b7a60bfd1275a165c6bdfbd3f74bdb456"} Apr 16 15:16:56.765856 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.765774 2565 scope.go:117] "RemoveContainer" containerID="914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8" Apr 16 15:16:56.773844 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.773636 2565 scope.go:117] "RemoveContainer" containerID="a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9" Apr 16 15:16:56.780967 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.780948 2565 scope.go:117] "RemoveContainer" containerID="914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8" Apr 16 15:16:56.781241 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:16:56.781225 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8\": container with ID starting with 914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8 not found: ID does not exist" containerID="914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8" Apr 16 15:16:56.781296 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.781249 2565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8"} err="failed to get container status \"914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8\": rpc error: code = NotFound desc = could not find container \"914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8\": container with ID starting with 914857b3b716bff0cfd30fa03a0f6580045bc80310f448aed2b0bc0980214ea8 not found: ID does not exist" Apr 16 15:16:56.781296 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.781267 2565 scope.go:117] "RemoveContainer" containerID="a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9" Apr 16 15:16:56.781501 ip-10-0-142-46 kubenswrapper[2565]: E0416 15:16:56.781483 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9\": container with ID starting with a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9 not found: ID does not exist" containerID="a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9" Apr 16 15:16:56.781538 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.781508 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9"} err="failed to get container status \"a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9\": rpc error: code = NotFound desc = could not find container \"a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9\": container with ID starting with a2d956a0593601ae190b0db95f4824e8e39a0124b662d9b55f7d33e1d3cc29a9 not found: ID does not exist" Apr 16 15:16:56.786159 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.786135 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt"] Apr 16 15:16:56.789136 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:56.789111 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-d8ecb-predictor-5ffd647596-dr7xt"] Apr 16 15:16:58.326523 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:16:58.326489 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" path="/var/lib/kubelet/pods/41446b76-2e24-47db-a963-c8287e69fbbd/volumes" Apr 16 15:17:16.169066 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169023 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtlq6/must-gather-wj7nq"] Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169316 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="storage-initializer" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169328 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="storage-initializer" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169340 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="storage-initializer" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169346 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="storage-initializer" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169351 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169357 2565 
state_mem.go:107] "Deleted CPUSet assignment" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169371 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169377 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169424 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="053841b9-fccf-41c9-a494-f9030a7432d7" containerName="kserve-container" Apr 16 15:17:16.169522 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.169431 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="41446b76-2e24-47db-a963-c8287e69fbbd" containerName="kserve-container" Apr 16 15:17:16.172231 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.172214 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtlq6/must-gather-wj7nq" Apr 16 15:17:16.174603 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.174574 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtlq6\"/\"kube-root-ca.crt\"" Apr 16 15:17:16.174726 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.174606 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtlq6\"/\"openshift-service-ca.crt\"" Apr 16 15:17:16.174805 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.174792 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xtlq6\"/\"default-dockercfg-sktqr\"" Apr 16 15:17:16.179783 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.179760 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtlq6/must-gather-wj7nq"] Apr 16 15:17:16.251571 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.251532 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7d161288-8fc5-4dfb-b7b5-ae02e95f859c-must-gather-output\") pod \"must-gather-wj7nq\" (UID: \"7d161288-8fc5-4dfb-b7b5-ae02e95f859c\") " pod="openshift-must-gather-xtlq6/must-gather-wj7nq" Apr 16 15:17:16.251571 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.251576 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdxv\" (UniqueName: \"kubernetes.io/projected/7d161288-8fc5-4dfb-b7b5-ae02e95f859c-kube-api-access-spdxv\") pod \"must-gather-wj7nq\" (UID: \"7d161288-8fc5-4dfb-b7b5-ae02e95f859c\") " pod="openshift-must-gather-xtlq6/must-gather-wj7nq" Apr 16 15:17:16.303077 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.303038 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 15:17:16.305841 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.305819 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 15:17:16.352025 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.351985 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7d161288-8fc5-4dfb-b7b5-ae02e95f859c-must-gather-output\") pod \"must-gather-wj7nq\" (UID: \"7d161288-8fc5-4dfb-b7b5-ae02e95f859c\") " pod="openshift-must-gather-xtlq6/must-gather-wj7nq" Apr 16 15:17:16.352025 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.352029 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spdxv\" (UniqueName: \"kubernetes.io/projected/7d161288-8fc5-4dfb-b7b5-ae02e95f859c-kube-api-access-spdxv\") pod \"must-gather-wj7nq\" (UID: \"7d161288-8fc5-4dfb-b7b5-ae02e95f859c\") " pod="openshift-must-gather-xtlq6/must-gather-wj7nq" Apr 16 15:17:16.352460 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.352406 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7d161288-8fc5-4dfb-b7b5-ae02e95f859c-must-gather-output\") pod \"must-gather-wj7nq\" (UID: \"7d161288-8fc5-4dfb-b7b5-ae02e95f859c\") " pod="openshift-must-gather-xtlq6/must-gather-wj7nq" Apr 16 15:17:16.359826 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.359794 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtlq6\"/\"kube-root-ca.crt\"" Apr 16 15:17:16.369425 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.369401 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-must-gather-xtlq6\"/\"openshift-service-ca.crt\"" Apr 16 15:17:16.380797 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.380768 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdxv\" (UniqueName: \"kubernetes.io/projected/7d161288-8fc5-4dfb-b7b5-ae02e95f859c-kube-api-access-spdxv\") pod \"must-gather-wj7nq\" (UID: \"7d161288-8fc5-4dfb-b7b5-ae02e95f859c\") " pod="openshift-must-gather-xtlq6/must-gather-wj7nq" Apr 16 15:17:16.484009 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.483922 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xtlq6\"/\"default-dockercfg-sktqr\"" Apr 16 15:17:16.492669 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.492643 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xtlq6/must-gather-wj7nq" Apr 16 15:17:16.611987 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.611941 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtlq6/must-gather-wj7nq"] Apr 16 15:17:16.615032 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:17:16.615001 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d161288_8fc5_4dfb_b7b5_ae02e95f859c.slice/crio-9bd9ef3f72a77f6bc923c3c86a66b345820a88c1ec5742b06c2de319217d166a WatchSource:0}: Error finding container 9bd9ef3f72a77f6bc923c3c86a66b345820a88c1ec5742b06c2de319217d166a: Status 404 returned error can't find the container with id 9bd9ef3f72a77f6bc923c3c86a66b345820a88c1ec5742b06c2de319217d166a Apr 16 15:17:16.822838 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:16.822746 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtlq6/must-gather-wj7nq" 
event={"ID":"7d161288-8fc5-4dfb-b7b5-ae02e95f859c","Type":"ContainerStarted","Data":"9bd9ef3f72a77f6bc923c3c86a66b345820a88c1ec5742b06c2de319217d166a"} Apr 16 15:17:17.831965 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:17.831075 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtlq6/must-gather-wj7nq" event={"ID":"7d161288-8fc5-4dfb-b7b5-ae02e95f859c","Type":"ContainerStarted","Data":"dce7e6453f5652fb97c41ab76c7e0d66cba36de703da30f310a61658fa4bb3b3"} Apr 16 15:17:17.831965 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:17.831123 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtlq6/must-gather-wj7nq" event={"ID":"7d161288-8fc5-4dfb-b7b5-ae02e95f859c","Type":"ContainerStarted","Data":"d48e8a2136c50412da8e59ae2c502fde1cafd0be0c88d211ecefdf7f0424111e"} Apr 16 15:17:17.846389 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:17.846317 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtlq6/must-gather-wj7nq" podStartSLOduration=1.002632175 podStartE2EDuration="1.846297347s" podCreationTimestamp="2026-04-16 15:17:16 +0000 UTC" firstStartedPulling="2026-04-16 15:17:16.616806696 +0000 UTC m=+1500.957937532" lastFinishedPulling="2026-04-16 15:17:17.460471871 +0000 UTC m=+1501.801602704" observedRunningTime="2026-04-16 15:17:17.845127308 +0000 UTC m=+1502.186258164" watchObservedRunningTime="2026-04-16 15:17:17.846297347 +0000 UTC m=+1502.187428203" Apr 16 15:17:18.903393 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:18.903321 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gc9vp_55745b2f-83f9-46da-95cf-59aa391f6226/global-pull-secret-syncer/0.log" Apr 16 15:17:18.975695 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:18.975659 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ldwhr_233701c2-920c-46da-8d99-9ee0fe62c01a/konnectivity-agent/0.log" Apr 16 
15:17:19.046665 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:19.046640 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-46.ec2.internal_540800c0848b372f11baec71c30048e7/haproxy/0.log" Apr 16 15:17:22.703287 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.703257 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_259010e4-75f4-4aad-bf92-d9e608b8e229/alertmanager/0.log" Apr 16 15:17:22.725968 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.725939 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_259010e4-75f4-4aad-bf92-d9e608b8e229/config-reloader/0.log" Apr 16 15:17:22.756944 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.756751 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_259010e4-75f4-4aad-bf92-d9e608b8e229/kube-rbac-proxy-web/0.log" Apr 16 15:17:22.780627 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.780601 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_259010e4-75f4-4aad-bf92-d9e608b8e229/kube-rbac-proxy/0.log" Apr 16 15:17:22.806567 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.806541 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_259010e4-75f4-4aad-bf92-d9e608b8e229/kube-rbac-proxy-metric/0.log" Apr 16 15:17:22.827317 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.827279 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_259010e4-75f4-4aad-bf92-d9e608b8e229/prom-label-proxy/0.log" Apr 16 15:17:22.850115 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.850030 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_259010e4-75f4-4aad-bf92-d9e608b8e229/init-config-reloader/0.log" Apr 16 
15:17:22.906143 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.906113 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-srtsh_98083ded-4c68-4825-94cf-619a9f409bd2/kube-state-metrics/0.log" Apr 16 15:17:22.925479 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.925452 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-srtsh_98083ded-4c68-4825-94cf-619a9f409bd2/kube-rbac-proxy-main/0.log" Apr 16 15:17:22.949229 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:22.949197 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-srtsh_98083ded-4c68-4825-94cf-619a9f409bd2/kube-rbac-proxy-self/0.log" Apr 16 15:17:23.109586 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:23.109516 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d4tk9_c5878d8f-236e-48f5-bfaf-d655e638f782/node-exporter/0.log" Apr 16 15:17:23.133891 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:23.133858 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d4tk9_c5878d8f-236e-48f5-bfaf-d655e638f782/kube-rbac-proxy/0.log" Apr 16 15:17:23.155951 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:23.155898 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d4tk9_c5878d8f-236e-48f5-bfaf-d655e638f782/init-textfile/0.log" Apr 16 15:17:23.568005 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:23.567966 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-968985cc8-wmt5h_9a9b599d-95ba-4f77-ba75-7b4ef3afdc51/telemeter-client/0.log" Apr 16 15:17:23.597698 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:23.597668 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-968985cc8-wmt5h_9a9b599d-95ba-4f77-ba75-7b4ef3afdc51/reload/0.log" Apr 16 15:17:23.626994 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:23.626937 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-968985cc8-wmt5h_9a9b599d-95ba-4f77-ba75-7b4ef3afdc51/kube-rbac-proxy/0.log" Apr 16 15:17:26.248814 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.248781 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl"] Apr 16 15:17:26.252502 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.252477 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.262459 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.262436 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl"] Apr 16 15:17:26.334468 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.334440 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwn7\" (UniqueName: \"kubernetes.io/projected/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-kube-api-access-gzwn7\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.334625 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.334486 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-proc\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.334625 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.334518 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-lib-modules\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.334625 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.334544 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-podres\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.334625 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.334564 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-sys\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435285 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435245 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwn7\" (UniqueName: \"kubernetes.io/projected/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-kube-api-access-gzwn7\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435480 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435321 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-proc\") pod \"perf-node-gather-daemonset-bllrl\" (UID: 
\"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435480 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435373 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-lib-modules\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435480 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435403 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-podres\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435480 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435439 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-sys\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435480 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435450 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-proc\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435703 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435517 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-sys\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435703 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435530 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-lib-modules\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.435703 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.435532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-podres\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.442924 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.442895 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwn7\" (UniqueName: \"kubernetes.io/projected/5c4807c3-3337-4d0c-bd7d-4b180ec382c1-kube-api-access-gzwn7\") pod \"perf-node-gather-daemonset-bllrl\" (UID: \"5c4807c3-3337-4d0c-bd7d-4b180ec382c1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.564616 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.564524 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.693759 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.693727 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl"] Apr 16 15:17:26.697275 ip-10-0-142-46 kubenswrapper[2565]: W0416 15:17:26.697238 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5c4807c3_3337_4d0c_bd7d_4b180ec382c1.slice/crio-34ed981b945055d27c04042c094c2fe010e6adf790028200deff59f958fa0801 WatchSource:0}: Error finding container 34ed981b945055d27c04042c094c2fe010e6adf790028200deff59f958fa0801: Status 404 returned error can't find the container with id 34ed981b945055d27c04042c094c2fe010e6adf790028200deff59f958fa0801 Apr 16 15:17:26.698869 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.698844 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:17:26.729277 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.729249 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m9hrv_927790a5-7672-4d93-a725-5924ae587d09/dns/0.log" Apr 16 15:17:26.748365 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.748345 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m9hrv_927790a5-7672-4d93-a725-5924ae587d09/kube-rbac-proxy/0.log" Apr 16 15:17:26.792268 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.792236 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f4596_cdb087ca-e5b6-43aa-88b4-f2d25147cf7e/dns-node-resolver/0.log" Apr 16 15:17:26.867621 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.866654 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" 
event={"ID":"5c4807c3-3337-4d0c-bd7d-4b180ec382c1","Type":"ContainerStarted","Data":"0ef26a004218ff4ca32e40a6bff9a995f6d38c007f5555e8105e56ef94a0c9e1"} Apr 16 15:17:26.867621 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.866698 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" event={"ID":"5c4807c3-3337-4d0c-bd7d-4b180ec382c1","Type":"ContainerStarted","Data":"34ed981b945055d27c04042c094c2fe010e6adf790028200deff59f958fa0801"} Apr 16 15:17:26.867621 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.867523 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:26.882964 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:26.882911 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" podStartSLOduration=0.88289547 podStartE2EDuration="882.89547ms" podCreationTimestamp="2026-04-16 15:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:17:26.88120316 +0000 UTC m=+1511.222334016" watchObservedRunningTime="2026-04-16 15:17:26.88289547 +0000 UTC m=+1511.224026324" Apr 16 15:17:27.252055 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:27.252024 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fpnf7_40f100bd-4ea9-4c4c-bfe7-d00fbe4359f5/node-ca/0.log" Apr 16 15:17:28.262072 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:28.262021 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4qbwg_9a14533b-b916-4308-8775-7107db9fe6de/serve-healthcheck-canary/0.log" Apr 16 15:17:28.725005 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:28.724975 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-chpgs_7c771f50-4b0e-4280-b47b-81da44e68d3d/kube-rbac-proxy/0.log" Apr 16 15:17:28.744730 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:28.744706 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-chpgs_7c771f50-4b0e-4280-b47b-81da44e68d3d/exporter/0.log" Apr 16 15:17:28.764116 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:28.764089 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-chpgs_7c771f50-4b0e-4280-b47b-81da44e68d3d/extractor/0.log" Apr 16 15:17:30.705145 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:30.705120 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-f4zjc_230f084b-c0f2-4810-894d-b3dc1ebf1291/server/0.log" Apr 16 15:17:30.832088 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:30.832045 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-7cvhl_a61e3784-af03-4363-9a93-d6e2613fb991/seaweedfs/0.log" Apr 16 15:17:33.886466 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:33.885571 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-bllrl" Apr 16 15:17:35.858861 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:35.858764 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4mtb6_5172b522-bc83-41f2-8760-e2fba5340ff1/kube-multus-additional-cni-plugins/0.log" Apr 16 15:17:35.879837 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:35.879805 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4mtb6_5172b522-bc83-41f2-8760-e2fba5340ff1/egress-router-binary-copy/0.log" Apr 16 15:17:35.902821 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:35.902793 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4mtb6_5172b522-bc83-41f2-8760-e2fba5340ff1/cni-plugins/0.log" Apr 16 15:17:35.923781 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:35.923752 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4mtb6_5172b522-bc83-41f2-8760-e2fba5340ff1/bond-cni-plugin/0.log" Apr 16 15:17:35.948125 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:35.948091 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4mtb6_5172b522-bc83-41f2-8760-e2fba5340ff1/routeoverride-cni/0.log" Apr 16 15:17:35.969369 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:35.969345 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4mtb6_5172b522-bc83-41f2-8760-e2fba5340ff1/whereabouts-cni-bincopy/0.log" Apr 16 15:17:35.991240 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:35.991210 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4mtb6_5172b522-bc83-41f2-8760-e2fba5340ff1/whereabouts-cni/0.log" Apr 16 15:17:36.371263 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:36.371219 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jf55j_5bd34617-f0f5-4b74-b464-a7613ad4c7a9/kube-multus/0.log" Apr 16 15:17:36.473126 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:36.473096 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j76vn_38d86a56-d8b6-4bb2-a413-3166ca14717f/network-metrics-daemon/0.log" Apr 16 15:17:36.492527 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:36.492502 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j76vn_38d86a56-d8b6-4bb2-a413-3166ca14717f/kube-rbac-proxy/0.log" Apr 16 15:17:37.518251 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.518224 2565 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-controller/0.log" Apr 16 15:17:37.536654 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.536625 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/0.log" Apr 16 15:17:37.545361 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.545327 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovn-acl-logging/1.log" Apr 16 15:17:37.565976 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.565938 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/kube-rbac-proxy-node/0.log" Apr 16 15:17:37.589130 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.589098 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:17:37.607239 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.607158 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/northd/0.log" Apr 16 15:17:37.629116 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.629086 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/nbdb/0.log" Apr 16 15:17:37.661055 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.661027 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/sbdb/0.log" Apr 16 15:17:37.776990 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:37.776956 2565 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4krl_84ace2a9-8bcc-47b5-81bb-c764aa280104/ovnkube-controller/0.log" Apr 16 15:17:38.950137 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:38.950110 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-d7tkp_82d43552-0266-40be-b011-548c6b1da18a/network-check-target-container/0.log" Apr 16 15:17:39.874732 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:39.874703 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5zt6p_fd96140b-7f7e-4208-9c8d-400e5a881b11/iptables-alerter/0.log" Apr 16 15:17:40.505159 ip-10-0-142-46 kubenswrapper[2565]: I0416 15:17:40.505127 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-cj56j_c578c137-c4e7-4fd5-8394-2a72b0661d12/tuned/0.log"